Research

What I Need To Know

Going into this project, I needed to know what the highest level of computer automation currently is, and how we could use it to improve our existing systems. I'm interested in computer engineering as a career, so if I learn how to properly automate computer systems now, I can apply that knowledge to future projects. Both my hobbies and my career path involve some form of computer engineering, so becoming acquainted with computer automation early on would make a deeply positive impact on my life.

What I Already Know Or Assume

My prior knowledge in computing is extensive; however, the more I research a subject, the more I realize how much I don't know. I already know that the automation of computer systems can be split into two major parts: hardware and software. Software is the part I will focus on most, but hardware automation is also important. Most of my experience pertains to the software side of automation, with scripts and other programs that a computer can run to automate a task. I assume that complete automation is impossible, as all hardware is bound to fail eventually, and almost all software has some form of error or unintended side effect that would make it impossible to run indefinitely. Another of my assumptions is that we will reach a point in the development and application of automation where only one or two trained professionals are needed to maintain the system, or to add to it as needed.

The Search For Information

My search for information was extensive and lasted the better part of this school year: online research, annotated bibliographies, scientific article reviews and critiques from other classes, and, the largest part, my internship at Forsyth County Technology Services. There, I learned how a large, enterprise-level computer network and management system is run. I started in the networking section, learning how data travels from each computer, in each school, to every piece of hardware in the data center. This is also where the main security feature of the network lives: the firewall. The firewall is a key part of keeping our school system's computers automated, because automation can only be stable in a controlled environment; if there are any viruses, malware, or other threats, the computers cannot stay autonomous.

The next area I went to was Systems Administration, which is where the configuration for the behavior of each computer is set up. I started with the Macintosh side of management and took that on as my personal project for a few months. There, I learned how to use a system called JAMF, a mobile device management (MDM) platform, to send each computer software and software packages, configuration profiles with settings, custom scripts, and diagnostics. With JAMF I learned the basics of systems administration, as well as some introductory scripting. When we needed to automate a task that was not built into JAMF, we had to write a program to do exactly what we wanted. These scripts are very complex and take much longer to write and test than normal configuration profiles do. Once I had a good amount of experience with JAMF after running solo with it for a while, I started moving into the Microsoft area of administration. That software was called System Center, and it was much more complex than JAMF.
The application I used to interface with System Center was System Center Configuration Manager, or SCCM. SCCM had much more built into it than JAMF did, so I didn't need to script anything outright. What I learned from using both of these pieces of software is that it is very necessary to have a human behind the controls, because many of the errors we encountered would not have been solvable by a computer alone. After realizing this, I did some research on artificial intelligence and machine learning. These systems are bleeding-edge technology and are being researched and developed constantly. Google, the popular tech company, has even gone as far as to make AutoML, an AI that has been taught to design more AI, and sometimes it is even better at designing them than we are. After seeing those, I wondered how we would connect these systems together, and through that research I learned about virtual machines. A virtual machine is another piece of software that can be run, but instead of doing one task, it acts as if it were a physical machine. Instead of many smaller machines running specific software, we can use one machine running many VMs, with each VM running the specified software. I needed to know how we could control VMs in a more advanced way, instead of just treating each one as an independent machine. This led me to interview someone from AT&T, who told me about how AT&T is moving to a purely VM-based infrastructure. The key component of that development is called the Orchestrator, which controls all of the VMs' configurations and actions.
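The orchestrator idea can be sketched in just a few lines of code: one program watches a fleet of VMs and restores any that go down. This is only a toy illustration of the concept, not AT&T's actual software; the `VM` class and its methods here are hypothetical stand-ins, not a real hypervisor API.

```python
# Toy sketch of an orchestrator: a supervisor that checks every VM in a
# fleet and restarts any that have stopped running. The VM class below is
# a made-up placeholder, not a real virtualization library.

class VM:
    def __init__(self, name):
        self.name = name
        self.running = True   # assume a freshly created VM is healthy

    def restart(self):
        self.running = True   # a real restart would call the hypervisor


def orchestrate(fleet):
    """Do one pass over the fleet, restarting any VM that is down.

    Returns the names of the VMs that were restarted."""
    restarted = []
    for vm in fleet:
        if not vm.running:
            vm.restart()
            restarted.append(vm.name)
    return restarted


if __name__ == "__main__":
    fleet = [VM("web-1"), VM("web-2"), VM("db-1")]
    fleet[1].running = False            # simulate one VM crashing
    print(orchestrate(fleet))           # the orchestrator brings it back
```

In a real system this loop would run continuously and also push out configuration changes, which is exactly the point made above: the orchestrator replaces many small manual tasks, but a person still has to maintain the orchestrator itself.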

What I Discovered

I discovered, in the end, that computer automation will always hit some sort of dead end. It's kind of like the child who always asks, "But why?" In order to keep a computer system functioning, there always needs to be someone to fix its problems; whenever something needs to be added to a computer, someone has to add it; and if something needs changing, someone needs to change it, and so on. With AI, we could create intelligent programs to fix problems and to add software or configurations, but what if that AI breaks? Do we train another AI to fix it? And what if that one needs maintenance? In the end, there must always be a real person servicing the system, no matter what. So I have come to the conclusion that we must find a happy medium between doing something on a computer manually and creating an automated system for it. I think we still have some distance to go before reaching that medium, but we may be closer than we think.