Bits, Bytes, and Machine Language
Quick Links
Genesis
Primitive Machines
Computers Become a Household Device
Energy Efficiency and Storage
Internet
Artificial Intelligence
Gaming
Programming Languages
Virtual Reality
Revelations
The Dreaded Robot Takeover
Human & Computer Similarities
Automobile & Computer Comparison
Image Gallery
Reality vs. Science Fiction
Debugging Buggy Code and Repairing Broken Hardware
Why Continue Developing this Field?
What about the Dangers?
Is there evolution for computers?
How to Fix Common PC Problems

Update: I repaired the image gallery for Computer History, although some of the images for the arrows don't work. The links are fine.

Genesis

Approximately 3,000 years ago Moses wrote that God said, "Let there be light." The Greeks, for their part, said let there be computing machines, and so one of the first computers was born: the Antikythera mechanism. There followed a dark age of computing for many centuries, until once again people said we need to make tasks easier.

At first, machines were built to perform very simple and repetitive tasks. An abacus can be thought of as one of the first calculators. Made of wood, with no electrical parts, circuits, processors, or memory, it still performs the function of calculating.

During the Industrial Revolution many machines were used for such tasks. There were steam-powered engines, windmills, and of course the many machines powered by fossil fuels.

Henry Ford was a pioneer in mechanized work. His Ford Model T was built on a moving assembly line, with conveyor belts similar to those seen in supermarkets to this day. Paying employees up to $5 a day was a good salary at the time, a kind gesture, and much higher than most wages of the era.

However, the dawn of computing proper can be credited to Alan Turing and Ada Lovelace. Turing, who thought up some of the initial rules for a computer to be artificially intelligent, worked on some of the first computers during World War 2^1. He suggested that for A.I. to exist a computer should be indistinguishable from a human. Yet why shouldn't a computer use its advantage in math to help it? The Turing test has been a popular way for people to estimate the progress of A.I., even amidst criticism. In 2014, a chat-bot passed the Turing test by tricking a panel of judges into believing it was actually a Ukrainian teen who had learned English as a second language. 33% of the judges were convinced it was human, while the threshold to pass, according to Turing, is 30%. Despite this being a triumph it was a text-based chat, not yet up to par with the sci-fi film Blade Runner, where a layman police officer has trouble distinguishing between human and android.

Ada Lovelace, often mispronounced "loveless", wrote one of the first computer programs. Today, there is a programming language named after her. She was rumored to use opiates, laudanum being a common prescription of the era, and apparently had an affair outside her marriage to her noble husband. She died young, at 36, from cancer.

Turing also passed away early. Chemically castrated for being homosexual, he was ostracized for his sexual orientation and is thought to have committed suicide by poison at age 41. However, he received a royal pardon signed by Queen Elizabeth II on December 24, 2013. This was only the fourth such pardon granted after the conclusion of World War 2^1.

Primitive Machines

In 1822, Charles Babbage began work on his Difference Engine, followed by his design for the Analytical Engine in 1837. In 1910, Babbage's son Henry completed a working portion of the machine's mill, only 103 years before this was written.

The Z1, or V1 as it was originally named, is considered the first programmable computer. Built between 1936 and 1938, it had a clock speed of a whopping 1 Hz!

Later, there was the EBR-1 in 1951. Then, the AVIDAC in 1953.

These were prototypes unavailable to the public. Even the military's DARPA-created prototype Internet, the ARPANET, was not initially available to the public.

Computers Become a Household Device

Although Simon, sold in 1950-51, can be thought of as the first personal computer available for purchase, it bears little resemblance to most computers around today with their keyboard, mouse, monitor, desktop case, and speakers. More high-priced machines were developed in the coming decades, some produced in runs of only a few hundred units or fewer. The Apple II was one of the first successful mass-produced computers available for purchase, shortly followed by IBM, which coined the term "PC". Many more brands began to branch out. Compaq. Hewlett-Packard. Dell. Samsung. Acer. Intel. IBM. Toshiba. This is hardware.

Operating systems are software. Let's review a few: Mac OS X (Snow Leopard, for instance), Windows, MS-DOS, and Unix with its many derivatives and flavors, such as Linux and Sun's Solaris, are all operating systems.

An OS is like the mind of the PC, which is different from the brain/processor. Storage keeps information as blocks of data in different sectors of the hard drive or other memory device. Operating systems began as text-only interfaces, some limited to just two colors. Registers and displays had limited bit widths at first, so machines started with monochrome output and gradually could render more colors. Images could be stored and created on computers. Video games could be made and played. A desktop GUI environment began to dominate the computer world. Finally, people could use computers without complicated commands.
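As a rough illustration of that color-depth progression, the number of distinct colors doubles with every extra bit per pixel. A minimal sketch; the bit depths listed are just common examples, not tied to any particular machine:

```python
# Number of distinct colors representable at a given bit depth: colors = 2 ** bits.
for bits in (1, 4, 8, 16, 24):
    print(f"{bits:>2}-bit color: {2 ** bits:,} colors")
# 1-bit gives 2 colors (monochrome), 8-bit gives 256, 24-bit gives about 16.7 million.
```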

The 1980s and 1990s advanced computers more than any prior era. Macintosh and Microsoft Windows were the most popular operating systems available to consumers.

After MS-DOS and Unix came some of the prototypes for modern computing. Floppy disks were used as removable storage for data and programs. At first barely able to hold more than a few images, removable memory can now store feature-length films, music albums, thousands of photos, and large video games and applications thousands to millions of times larger than the entire hard drive of a decade before.

Energy Efficiency and Storage

Throughout the 18th and 19th centuries several different forms of energy were used. Before then, cleaner sources such as wind and water power were common; the 18th and 19th centuries relied heavily on coal and gas. These fuels weren't limited to those centuries, but they were polluting.

Many types of units have been used to measure the input and output of appliances and electrical equipment. Nuclear energy is typically considered the most powerful source available to power city blocks and whole cities, and it is quite powerful indeed, but the radiation released when these plants fail can be disastrous.

Electric meters typically measure the electricity consumed, in kilowatt-hours, by most homes, apartment complexes, retirement communities, colleges, schools, business districts, etc.

A computer, whether quad-core, dual-core, or single-core, that can complete the same tasks using less energy than a similar machine is considered the more energy-efficient of the two. Hence the topic. Remember, though, that computers were designed to process on the order of thousands to millions of operations per second.
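One simple way to make that comparison concrete is to count operations completed per joule of energy. A minimal sketch, with made-up numbers purely for illustration:

```python
# Compare two hypothetical machines by how many operations they complete per joule.
def ops_per_joule(operations: float, energy_joules: float) -> float:
    """Efficiency metric: operations completed per joule of energy consumed."""
    return operations / energy_joules

machine_a = ops_per_joule(1_000_000, 50.0)   # 20,000 ops per joule
machine_b = ops_per_joule(1_000_000, 200.0)  #  5,000 ops per joule

print(f"Machine A: {machine_a:,.0f} ops/J")
print(f"Machine B: {machine_b:,.0f} ops/J")
print("A is more energy efficient" if machine_a > machine_b else "B is more energy efficient")
```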


The move from tapes to disks was a large step in storage. It was safer and could hold more data. Floppy disks could only hold about 1.44 MB of data, but a CD-ROM could store about 700 MB. DVDs and Blu-rays look almost identical to CD-ROMs but store far more: roughly 4.7 GB for a single-layer DVD and 25 GB for a Blu-ray. They are still susceptible to scratches and corruption, though, which can destroy the data. Traditional hard drives also rely on spinning platters, although the industry has been moving toward chip-based solid state drives. Even though solid state drives were on the market in the 1990s, they only began to become popular circa 2015. Compared with traditional hard drives, SSDs can access data up to 4x faster when connected via a SATA cable, and up to 10x faster when plugged into a PCIe slot or connected directly to the motherboard.
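To put those capacities in perspective, here is a quick back-of-the-envelope comparison using the nominal figures above:

```python
# How many floppy disks' worth of data fits on each optical medium (nominal capacities).
floppy_mb = 1.44
media = {"CD-ROM": 700, "DVD (single layer)": 4_700, "Blu-ray (single layer)": 25_000}

for name, capacity_mb in media.items():
    print(f"One {name} holds roughly {capacity_mb / floppy_mb:,.0f} floppies' worth of data")
```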

However, a brand-new type of storage is in development, nicknamed the "Superman memory crystal". This 5D optical data storage can reportedly hold hundreds of terabytes of information that could theoretically last for thousands of years. It does this by writing data into nanostructured fused quartz, which cannot be easily damaged. The concept is still in development and not yet available to the general public as of 2021. However, a test sample was given to Elon Musk, head of SpaceX, who loaded it onto a Tesla Roadster along with a dummy astronaut; the car was launched into a heliocentric orbit reaching past Mars in 2018.

In the DC comics, Superman has computerized devices in his ice cavern, the Fortress of Solitude, that Kent uses to view old footage from Krypton, the planet he escaped before an extinction-level event. Superman became popular with Christopher Reeve in the '70s, and lately has regained popularity with the sci-fi film "Justice League" and the television series "Krypton".

While some of the newest storage is Internet "cloud-based", offering up to a terabyte or more on paid plans, below is a table of traditional and cloud media. If you do decide to get cloud storage I recommend Dropbox, because that's all they do. As of April 2021:

Storage/Memory Table
Company/Product                              Amount    Price    Upgrade Available?
Dropbox Cloud Storage                        2 GB      Free     Yes
Microsoft OneDrive Cloud Storage             5 GB      Free     Yes
Google Drive Cloud Storage                   15 GB     Free     Yes
WD Blue Internal SATA Hard Drive             1 TB      $40      -
WD Blue Internal SATA Solid State Drive      1 TB      $100     -
SanDisk USB Flash Drive                      32 GB     $10      -

Internet

In this era, as corporations launched new computers annually and researchers developed new ways to improve every aspect of the machines, there was also a boom in communications and networking. An entire office could share a printer or fax machine. But the largest communication medium of all was the Internet. Basically, it is computers and servers exchanging data over TCP/IP connections, each of which begins with a handshake. Since socializing and file sharing are an essential part of the online experience, the Internet expanded quickly, forcing countries and political regions to adapt by making new laws to regulate ordinary activities such as theft, business, freedoms, behavior, taxes, etc.
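For the curious, here is a minimal sketch of such a connection in Python. The TCP three-way handshake (SYN, SYN-ACK, ACK) happens inside create_connection(); the host and the bare-bones HTTP request are just placeholders for illustration:

```python
import socket

# Open a TCP connection (the handshake happens here), then request a page over it.
with socket.create_connection(("example.com", 80), timeout=5) as conn:
    conn.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    response = b""
    while chunk := conn.recv(4096):  # read until the server closes the connection
        response += chunk

print(response.decode(errors="replace")[:200])  # first 200 characters of the reply
```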

Accessing the Internet is all done through software: applications, and typically browsers. While the Windows default (Internet Explorer) has been the most popular on that platform, alternatives such as Google Chrome, Firefox (a descendant of Netscape Navigator), Safari on the Macintosh, and plenty of lesser-known browsers are available to browse the web. While the earliest online communication happened through IRC and bulletin boards (now called forums), the popularity and ease of the GUI (graphical user interface) brought more media into browsers, including images, audio, video, and occasionally live chat and feed sessions with audio and video.

While the Internet can be used as a tool to socialize with friends, listen to radio or music, read the news, and hear updates from celebrities or politicians, it can also be used for illegal purposes. With Napster, file-sharing became an easy way for people to share any file they wanted with anyone connected to the program, all over the world. This meant that people could share copyright-protected material and digitally steal music, video games, movies, pornography, books, software, and any other type of digital file. Huge profit losses prompted lawsuits and, through police and federal agencies, arrests intended to reduce the amount of digital theft going on. While the pirates generally viewed it as harmless, since everyone from senior citizens to small children was participating, a new wave of lawsuits showed that the theft was taken as seriously as in-store stealing. Piracy began to decline as more and more web sites and servers were shut down; however, in countries outside the United States piracy laws are not enforced as strictly, and most of the lost profit is lost there. Other, more decentralized p2p programs followed, such as WinMX and LimeWire.

USA PATRIOT Act. Government surveillance of citizens and suspected terrorists, with little regard for the Constitution, had occurred before the Internet became popular, through wiretaps, bugs, and other equipment. Afterward, the U.S. government began writing bills, laws, and acts allowing law enforcement such as the FBI and police to view ISP logs, search engine queries, and online purchases in order to target suspected terrorists. The main criticisms from the public were that not every law enforcement agency was following the law and getting warrants, and that the act infringed upon constitutionally guaranteed freedoms of the kind championed by founding fathers such as Benjamin Franklin.

Not every use of the web was illegal, however, and its continued growth sparked sites such as Myspace, which revolutionized social networking; eBay, an online auction marketplace; Twitter, for small tweets; WoW, an online gaming megacommunity; and Wikipedia, the giant encyclopedia edited by anyone. Companies began to offer many services for free, including e-mail, video uploading sites such as YouTube, and Groupon, which shows local discounts and coupons. All these services are free because advertisers mine the data to market to consumers, putting people's privacy even more at risk.

Artificial Intelligence

While many people were struggling to stay up-to-date or keep their computers working properly, computer scientists made one of the biggest accomplishments in A.I. history. A supercomputer nicknamed "Deep Blue" was created and programmed to challenge the world chess champion, Garry Kasparov. Chess is a very complicated board game considered an intellectual hobby or career by many. For those unfamiliar, a game consists of three parts: the opening, the middle game, and the end game. Each player starts with equal pieces and there are no instruments of probability such as dice, cards, or tokens. From any given position there can be thousands of possible continuations. One blunder, or bad move, even one as small as a lost pawn, can determine the result of a game.

In 1996, Deep Blue challenged Garry Kasparov in a six-game match and lost. In 1997 they had a rematch and Deep Blue won by one game - one point. Another example of A.I. is Watson, which defeated the best Jeopardy players in 2011. Many programs and engines have been written to simulate players in many different computer languages.
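Most of those engines are built on some form of game-tree search. Below is a minimal, generic minimax sketch; the game-specific helpers (legal_moves, apply_move, is_over, evaluate) are hypothetical placeholders, and real engines such as Deep Blue add alpha-beta pruning and far more sophisticated evaluation:

```python
def minimax(state, depth, maximizing, legal_moves, apply_move, is_over, evaluate):
    """Search the game tree `depth` plies deep and return the best achievable score."""
    if depth == 0 or is_over(state):
        return evaluate(state)  # heuristic score of the position
    scores = (minimax(apply_move(state, m), depth - 1, not maximizing,
                      legal_moves, apply_move, is_over, evaluate)
              for m in legal_moves(state))
    return max(scores) if maximizing else min(scores)
```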

The biggest problem people have with accepting A.I., and its more human-like embodiment in androids, is that machines don't produce original thoughts, language, or emotion. Then again, emotion is something that not all people frequently express either.

Computer programs do exist to simulate the frequency, pitch, and volume created naturally by human vocal cords, such as text-to-speech software. In 2013, a team of scientists (including one who had attached a robotic arm in place of his missing hand) assembled a bionic man out of the most advanced technology available at the time: an artificial heart, robotic legs from Japan that walk very slowly, a few other artificial organs, a computer-generated face and prosthetic skin to represent the human face, and an A.I. program that responds to input via the Internet with audible replies. While the project wasn't close enough to completion to fool anyone about what kind of sentient being they were interacting with, robot or human, perhaps in half a century the distinction will not be so easy.

While general A.I. is still a theoretical subject in computer science, many useful and important techniques are already available to the public and are being built into devices such as mobile phones, tablets, laptops, and personal computers.

Gaming

After decades of computers being used solely for adding, subtracting, and typing, they began to be used to play games. First, text-based games were created and programmed. Then some of the first graphical games were programmed, such as Pong in 1972.

A large industry boom occurred in the computing world, starting a revolution of gaming that went through many generations in the following decades. Now, in 2014, the video game industry is a multi-billion-dollar industry, as big as or bigger than movies and other types of media and perfectly capable of competing in the entertainment business. First came arcade cabinets, each built around an individual game; then home consoles arrived that could play dozens or even thousands of games. Many systems failed. The winners included Nintendo, Commodore, Sega, and Atari. Further generations of gaming developed; many systems failed and others succeeded. Nintendo and Sega succeeded.

Then in the '90s new systems began to be developed and released. Sony's PlayStation arrived, and Microsoft's Xbox followed in 2001. Sega lasted through the decade, its last system being the Dreamcast, released circa 1998. Gaming systems use dedicated graphics chips to do the majority of the processing, since games lean so heavily on imagery.

Virtual Reality

The first mass-produced VR device was the Virtual Boy, released in 1995. Far from the sci-fi dreams of Star Trek's Holodeck and other VR in pop culture, it was considered one of gaming's greatest failures. The display was red-and-black and worn on the head. Instead of making the person feel like they were in another world, the headset had ugly graphics and was basically just a small monitor planted in front of the eyes. It was discontinued shortly after release.

However, a tech enthusiast by the name of Palmer Luckey later began to improve on the idea once better technology, including 3D rendering, was available. From the article I read in Wired, the basic concept is a modified smartphone-style screen combined with sensors that track the head's movement. Other hobbyists were interested in his progress and he sent them early versions of the hardware, attracting the interest of one of the developers of the video game Doom. Oculus VR, the company that will be releasing the product (the Rift), was recently acquired by Facebook for billions, in 2014. This led to some concern about the direction of the company: initially it was geared toward video game VR, but it may now be used more for social applications. As of right now there is no official consumer headset for sale, but some prototype developer kits can be bought online that may be fairly similar to what is eventually released to the general public.

Programming Languages

Programming languages have been around for roughly 100 years, or about 3% of the total history of computing machines. They are sets of instructions for computing devices, not a method of communication, so they do not consist of verbs, adjectives, objects, nouns, and so on. People claim to be "fluent" in C++ or Python, but this is a misnomer; really, they are proficient in those languages. Since debating the semantics of knowing how to code does not explain the history of coding, I'll continue with the history of the major languages.

Since computers hardly speak to each other except through networks, there really is no original verbal speech the way there is in human interaction. Seeing as androids, robots, and cyborgs are mostly science fiction, machines are generally bought and then treated as slaves sitting in an office, a home, or a carrying case. And why would they be treated differently when their main function is to display pixels on a monitor, drop items in a vending machine, or manage banking and shopping functions?

Binary is simple: zero or one, on or off, switched on a circuit board by transistors. Yet representing negative numbers is more complicated (computers typically use two's complement), and adding, subtracting, and multiplying require more careful math.
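As a small illustration of how negative numbers are commonly represented, here is two's complement sketched in Python, assuming an 8-bit word purely for demonstration:

```python
BITS = 8

def to_twos_complement(value: int, bits: int = BITS) -> str:
    """Return the two's-complement bit pattern of `value` as a string."""
    return format(value & ((1 << bits) - 1), f"0{bits}b")

print(to_twos_complement(5))    # 00000101
print(to_twos_complement(-5))   # 11111011  (flip the bits of 5, then add one)
print(to_twos_complement(-1))   # 11111111

# The same adder circuitry works for positive and negative values:
print(format((5 + (-5)) & ((1 << BITS) - 1), "08b"))  # 00000000
```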

Assembly makes this easier, yet try finding someone who has no problem coding in it. Simple math and displaying text are not the most difficult tasks, yet many struggle even with these. Looping in assembly and managing hexadecimal addresses is more time-consuming and tedious, but for the left-brained this is where the fun begins.

In 1957 FORTRAN appeared, along with the first compilers to translate it into machine code. Primitive by today's standards, it paved the way for more complicated instructions.

1959, COBOL appeared. Still used today, it came out of the post-World-War-2 reconstruction era. Notice the nearly universal English-like commands; that could easily be changed.

1970, the Pascal programming language started showing up, named after the French mathematician Blaise Pascal, who famously wagered that if you believe in God and are wrong you lose nothing, but if you believe and are right you gain everything. Quite the wager.

1972, the C programming language, ancestor of C++ and C#. These are all still considered real programming languages - not server-side scripting languages that are practically useless outside of web servers and browsers.

1987, Perl. Long a workhorse for mail-handling and web server scripts; popularly used at the shell and with CGI.

Scheme dates back to 1975, and Racket (originally PLT Scheme) started in 1995, the same year as Java. These are the notorious heavily parenthesized languages: started by a team of semi-anarchists, the family is used in A.I. and is infamous for the many parentheses used to group code together.

1995, Java - the programming language, not the Indonesian island. Created by Sun Microsystems (later acquired by Oracle), partly to avoid the memory errors, such as buffer overflows, common in C and C++: don't let people use all the memory they want for their variables.

It's worth mentioning that quite a few languages are similar. C and its variations, Java, and PHP are closely related. JavaScript looks similar to Java. Python is one of the newer languages and resembles aspects of Racket and C++.

In web browsers, getting code to work correctly can be a great challenge. Understanding how the average user behaves is essential to making a functioning website. Code can work in one version of a browser but not in another, and since there are several different browsers, each with their own separate versions, displaying a website consistently across all of them can be difficult. This is called cross-browser compatibility.

As Python gained popularity for being easy to learn and for omitting semicolons (saving hours of debugging), many web frameworks were created. A CMS imposes its own rules and can restrict what code runs where. In its most basic form a web page consists of markup, HTML or XML. CSS is used to style the page (the colors, the font and font-family, plus many other options). Other server-side languages such as ASP, PHP, and VB.NET allow access to databases and text-based files. The security of these approaches fluctuates from time to time.

One important thing about programming languages is understanding the different roles of names in each language: there are reserved words, keywords, constants, operators, expressions, functions, and identifiers. Compilers are used to make new programming languages; typically a compiler for a new language is written in an existing language, but in rare circumstances a compiler can bootstrap itself and compile using its own language.
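As a small example of reserved words versus ordinary identifiers, Python even ships a module that lists its own keywords (the variable names below are made up purely for illustration):

```python
import keyword

print(keyword.kwlist)              # the reserved words: 'if', 'for', 'class', 'return', ...
print(keyword.iskeyword("for"))    # True  - cannot be used as a variable name
print(keyword.iskeyword("total"))  # False - a legal identifier

MAX_RETRIES = 3          # a constant (by convention, named in capitals)
total = MAX_RETRIES * 2  # 'total' is an identifier; '*' and '=' are operators
print(total)             # 6
```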

Revelations

The whole world had become reliant on computers, but the date format used inside much code stored the year as 2 digits instead of 4 or more. So in 1999 A.D., as the year 2000 approached, people became afraid that the many things dependent on computers, such as banking systems and ATMs, would malfunction and spit money out to whomever. Programmers went in, though, and changed the date formats, and the feared apocalypse never occurred. The hysteria was called Y2K.
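A minimal sketch of the underlying bug, using made-up account dates purely for illustration:

```python
def years_elapsed_2digit(start_yy: int, end_yy: int) -> int:
    """Naive elapsed-years calculation that keeps only the last two digits of each year."""
    return end_yy - start_yy

# An account opened in 1985 ('85'), checked in the year 2000 ('00'):
print(years_elapsed_2digit(85, 0))   # -85  -- nonsense; interest and age math break

def years_elapsed_4digit(start: int, end: int) -> int:
    """The fix: store and subtract full four-digit years."""
    return end - start

print(years_elapsed_4digit(1985, 2000))  # 15
```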

The Dreaded Robot Takeover

Y2K is not the worst nightmare people have thought up; that honor goes to the "singularity", the point where machine intelligence surpasses human intelligence. The Matrix is a good example of a well-thought-out nightmare in which a future gone wrong has machines enslaving humans and using them as a source of energy. While humans do not emit a significant electrical current with which to unconsciously power armies of despotic machines, we saw in the movie that the machines were still vulnerable to an electromagnetic pulse if such an event were ever to occur.

While the future is of course unknown, that doesn't stop people from trying to predict it. Rumors circulate that extremely rich secret societies, such as the Bilderberg Group, would like to fund robot armies to fight in wars. From a more utilitarian angle, others have suggested less dramatic milestones: robots may be able to do most human jobs by 2030, and A.I. could pass human intelligence by 2045. I don't know if the next century will be like Blade Runner, or Terminator, or Star Wars, but it's clear that new advances in technology will be patented and sold and continue to lead the D.J.I.A.

Human & Computer Similarities

  Human                               Computer
  heart                               processor?
  blood                               electricity
  hair                                dust
  eyes                                camera
  ears                                microphone
  mouth                               speaker
  bowels                              (no waste excreted)
  umbilical cord                      internet
  brain                               circuit breaker/switch
  DNA                                 syntax
  short-term memory                   RAM
  long-term memory                    ROM
  Status: awake/asleep                on/off
  Energy: food, nutritious liquids    batteries, electrical outlet
  Defective building blocks: cancerous cells    corrupt data/spam

Automobile & Computer Comparison

Automobile             Computer
engine                 processor
fuel source            electricity / power plant
transmission           transformer
ECM                    motherboard
catalytic converter    power outage

Reality vs. Science Fiction

With CGI and other film-editing techniques it's sometimes hard to tell the real progress of robotics. It's safe to say that if a robot isn't being sold as a servant through megamarket vendors, such as Wal-Mart, or online marketplaces, like Amazon, then it probably isn't fully functional in the laboratory or wherever it's being made. A robot-like prop was made for the movie I, Robot, but it wasn't functional; CGI was used to make the robot maneuver and talk. In Prometheus, David was an android, and few special effects were needed since the film never shows anything below the skin. If you are wondering, Michael Fassbender, who played David, is a human. In Terminator 2, the crew used pulleys and metal to simulate robotic arms, and metallic materials mixed with makeup to fake a metal surface under the skin. The liquid, mercury-like T-1000 android was digitized using several software programs, including Adobe Photoshop.

While there have been hundreds of video games that use robots, there isn't much mystery to their design. 2D or 3D, they are all rendered graphically, sometimes using green screens and motion capture mapped from human movement.

Minority Report features hover vehicles, retinal scans, fingerprint scans, newly designed post-modern cities, virtual reality, and holograms. The floating touchscreen monitor and keyboard could have been done using projection.

The TV show Knight Rider (1982) had a self-driving car and a voice recognition system similar to Siri on the Apple iPhone. While working prototypes of self-driving cars are nowadays being developed by companies such as Google, back in 1982 it was all science fiction. The movie Demolition Man (1993) also had a self-driving car, restricted to a magnetic rail on the road. A few Batman movies also feature self-driving vehicles that come to meet Batman when summoned from his phone. In Knight Rider, KITT (Knight Industries Two Thousand) fooled viewers into believing in automatic driving by putting the driver in the back seat, hiding him from view, or by towing the car with a cable, so it appeared to drive without anyone in the front seat. The voice recognition system was actually an actor in a sound recording booth who read a script; the audio was added to episodes separately.

Debugging Buggy Code and Repairing Broken Hardware

Why Continue Developing this Field?

Good question. Some people argue that computer science and engineering aren't useful, but they're actually used in everyday devices such as GPS, and self-driving vehicles are being developed and tested that should hopefully be ready by 2020 or before. Also, robot-assisted surgery is becoming more common to help patients and doctors with surgical procedures. The same algorithms that made a computer the top Jeopardy champ are simultaneously being developed, and probably patented by IBM, to help doctors, nurses, EMTs, and other medical personnel heal patients.

What about the Dangers?

You mention that this technology could be used to build robot armies. I don't want my robot to start a mutiny and kill me or my family, or my self-driving car to accidentally drive off a cliff. How will all the risks be dealt with?

That's an important question, one addressed decades ago by science-fiction writers such as Isaac Asimov, by Turing, and by modern-day scientists. It is really a philosophical question that many people disagree about. If an android is able to feel emotion, make intelligent decisions, and potentially pay taxes, hold a career, pass high school, or even serve in the military, all while being nearly indistinguishable from a human, shouldn't it have the same rights as a normal citizen? Many people think that if androids become a threat to humans they should have an automatic shutdown sequence so that they can't harm anyone. Yet if someone is harming an android that its owner paid $50K for, money earned over a decade, at what point is it OK for the android to defend itself? The alternative is like a psychopathic aggressor trying to maim and kill a crowd of innocent people while the crowd responds by playing dead or committing suicide.

Also, the idea that an android could potentially commit suicide is a problem as well, especially if it was responsible for a very important task such as rescuing patients from death, serving food to a large community, leading a platoon, or teaching a classroom of students.

I think the problem of a broken android, robot, computer, or cyborg would mainly be the fault of its programmers. People have debated the ethics of war for thousands of years, yet weapons continue to be manufactured, from bows and arrows to swords and guns to poisonous gas and nuclear warheads. The idea that a group of people would dedicate themselves to making equipment and robotic soldiers purely with the intent of destruction is not so far-fetched. Yet the usefulness of artificial intelligence for reducing auto accidents, saving ill or dying patients, helping the elderly with chores, serving as companions, improving economies, or exploring the galaxy and universe for habitable planets cannot be ignored.

Some examples of military use of robotics and computers are drones, bomb-disposal robots, mech-style robot battle suits, and missile guidance systems, among hundreds of others.

The second part of the question is a problem in manually driven cars too. Malfunctioning vehicles have led to deaths and recalls by automakers in the past. Certain types of fuel pollute the environment, and there are auto accidents every day that lead to fatalities, yet automobiles aren't banned or made illegal. Cars, trains, airplanes, and other forms of transportation can be dangerous, yet we accept the risks because we want the benefits of being able to work far away from home, to visit friends and family, to go on vacation, et cetera.

Is there evolution for computers?

Short answer: no. Long answer: possibly.

Evolution is a well-established fact in the scientific community for biological systems, from organisms as small as bacteria to advanced, complicated ones like Homo sapiens. Evolution happens through populations, though, not individuals. It is a slow process that takes many generations, over thousands to millions of years, for small changes to occur, such as brain capacity, walking on 2 legs instead of 4, hand size and shape, or pelvic size. Evolution is not limited to Homo sapiens but extends to all organic life, including marine animals, trees, plants, and insects.

Could electrical systems evolve? We know that genetic changes are passed on to offspring through reproduction via DNA, which is present in every cell. Cells are present in every living thing on the planet, but not in androids, computers, robots, or other machines, and there is no reproduction in electrical systems. Bacteria have reproduced asexually for millions of years, and some organisms, such as certain worms, still do today. Something like zygotic bonding and genetic variation could potentially occur in machines in the future.

In Norbert Wiener's book Cybernetics: or Control and Communication in the Animal and the Machine, he discusses the possibility of self-reproducing machines. First, he makes a comparative analysis of insects and birds and connects their learning to the survival of their species - Darwinian survival-of-the-fittest type of stuff. Then he discusses chess theory and how a machine of that time, circa the 1950s, could beat amateur chess players but still struggled to defeat professionals (which Deep Blue finally did in 1997). On another side rant he talks about how Napoleon won a few battles and mentions some tales. Computers have connectors called male and female connectors, which sort of resemble genitals; of course, no gestation results. No offspring. The way old machines pass anything on is through old code and learning from mistakes. Computers get faster, are able to do more things, get new processors, display new colors and images better, transmit video over 4G instead of 3G, fix or delete viruses and worms, make online transactions safer, encrypt with more bits, update buggy code, remove backdoors, or help showcase an entirely new technology such as holograms or virtual reality devices.

How to Fix Common PC Problems

Step 1: Identify the main culprit
PC stands for "personal computer", not "politically correct." Is your computer too slow? Is it having trouble connecting to the Internet? Not playing video games correctly? Constantly crashing? Infected by malware?
A: It could be any one of those issues, or a combination of several.

Step 2: Diagnose the problem
Many times you simply have too many browser extensions or start-up programs running, or adware that you mistake for a virus or worm. Yes, there is malware out there, but most anti-virus programs can detect it and often remove the offending file. Fixing a software problem is generally cheaper than buying new hardware.

Step 3: Fix the problem
Remove the malware, update to the latest stable release, uninstall unnecessary programs, and so on. Try running your computer in Safe Mode to load only the applications you need. With memory so cheap nowadays, programmers take shortcuts and write sloppy code rather than efficient code. High-level code often takes many unnecessary steps to solve a problem, whether through OOP, recursion, or sorting. A program that fit in 10 MB in 1990 may now be 50 GB with every bell and whistle imaginable. That kind of bloat will eat up the majority of the processor's capacity before any other work can be done.

Try closing tabs you aren't using; ideally keep no more than 3-5 tabs open. You may need to buy more RAM, or a new computer entirely. Save your work and close any programs you aren't using. Disable start-up items unnecessary to the boot process.
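If you want to see what is actually eating your memory before deciding what to close, a small diagnostic sketch is below. It assumes the third-party psutil package is installed (pip install psutil):

```python
import psutil

# Collect each process's name and resident memory, skipping entries we can't read.
procs = []
for p in psutil.process_iter(["name", "memory_info"]):
    mem = p.info["memory_info"]
    if mem is not None:
        procs.append((mem.rss, p.info["name"] or "?"))

# Print the ten biggest memory users, largest first.
for rss, name in sorted(procs, reverse=True)[:10]:
    print(f"{name:<30} {rss / 1_048_576:8.1f} MB")
```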

Step 4

WIP