6 Common Myths in the IT Industry Debunked

Technology has advanced tremendously since humans first invented the wheel. Yet no matter where we are in that evolution, change is always scary to some people.

With so many innovations happening so fast today, it’s understandable that there’s a lot of fear and confusion. Movies like “The Lawnmower Man” and “The Terminator” put the ideas of virtual reality and artificial intelligence in front of mainstream audiences. They also spawned a lot of myths.

Now, most of us don’t believe that AI robots will take over the world. However, there are stories we share, thinking we’re being helpful when we’re actually spreading misinformation.

To get you past the fiction so you can focus on facts, we’ve put together a list of common myths and the reality behind them.

1. You Shouldn’t Charge Your Phone All Night

The myth is that charging overnight ruins your battery. It dates back to older nickel-based batteries, which really could suffer from overcharging, and to the old advice to fully charge a new device before first use.

Modern phones and laptops use lithium-ion batteries managed by charge controllers: once the battery is full, the device stops drawing charging current. Leaving your phone plugged in overnight won’t destroy the battery. Heat and sitting constantly at 100% can slightly accelerate long-term wear, which is why some manufacturers now offer optimized-charging features, but overnight charging itself is safe.

2. Some Devices Can’t Get Viruses

Are you still hearing the line that Apple products and cell phones can’t get viruses? If so, you’re way behind the current state of cybercrime.

Yes, Apple long marketed its OS as one that was nearly impossible to infect. Then in 2012, the Flashback malware infected hundreds of thousands of Macs, putting a massive hole in the indomitable Titanic of computer platforms.

Macs are targeted less often than Windows PCs, but like the Titanic, they’re not unsinkable. Neither are your cell phones. It’s less likely that they’ll get infected, but the more software you download, the higher the risk you’re taking.

3. AI and ML Are the Same Things

Often, ML and AI are used interchangeably. That’s incorrect: machine learning is a subset of artificial intelligence, not a synonym for it.

Artificial intelligence is the umbrella category that machine learning fits underneath. AI covers any technique that lets a computer perform tasks we associate with human intelligence, such as reasoning, planning, or understanding language. ML is more specific: it focuses on training systems to improve at a task by learning patterns from data rather than following hand-written rules.
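To make the distinction concrete, here’s a minimal sketch (the temperature-conversion task and all numbers are chosen purely for illustration): a rule-based approach where a human hard-codes the logic, next to a tiny machine-learning routine that recovers the same behavior by fitting example data.

```python
# Hand-coded rule vs. behavior learned from data.
# The task and data are made up purely for illustration.

def rule_based_fahrenheit(celsius):
    """Classic rule-based style: a human wrote the formula explicitly."""
    return celsius * 9 / 5 + 32

def learn_fahrenheit(examples):
    """Machine-learning style: fit slope and intercept from example
    (celsius, fahrenheit) pairs using ordinary least squares,
    instead of hand-coding the formula."""
    n = len(examples)
    sx = sum(c for c, _ in examples)
    sy = sum(f for _, f in examples)
    sxx = sum(c * c for c, _ in examples)
    sxy = sum(c * f for c, f in examples)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return lambda c: slope * c + intercept

# Training examples the "model" learns from.
data = [(0, 32), (10, 50), (20, 68), (30, 86)]
predict = learn_fahrenheit(data)

print(rule_based_fahrenheit(25))  # 77.0 (the rule a human wrote)
print(round(predict(25), 1))      # 77.0 (the same answer, learned from data)
```

The point isn’t the math: it’s that in the second function, nobody ever typed the conversion formula. The behavior came from data, which is what makes it machine learning.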

4. There is No Bias in Artificial Intelligence

AI is indeed designed to run on data. But the programs are built by humans, and the data is collected by humans, so artificial intelligence inherits whatever conscious or subconscious biases its developers and data sources carry.

Many deployed systems also keep learning from news articles and social media to stay up to date. Since those sources lean in particular directions, their opinions feed into the model and become part of its behavior.

There’s no way to get rid of bias completely. At best, a system is minimally biased and trained on widely diverse data. Some things remain subjective, even with the goal of pure objectivity.
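Here’s a toy sketch of how skewed data becomes a skewed model (the loan scenario and every number are invented for illustration): the simplest possible “model” just predicts whatever label dominated its training set, so an imbalanced dataset makes it blind to the minority case.

```python
from collections import Counter

# Invented, deliberately imbalanced training data:
# 9 "approved" loan examples and only 1 "denied" example.
training_labels = ["approved"] * 9 + ["denied"] * 1

def majority_model(labels):
    """The simplest possible model: always predict the label
    seen most often during training, ignoring the input entirely."""
    most_common_label, _count = Counter(labels).most_common(1)[0]
    return lambda _applicant: most_common_label

predict = majority_model(training_labels)

# The bias of the data becomes the bias of the model:
print(predict({"income": 20_000}))  # approved
print(predict({"income": 90_000}))  # approved
```

Real systems are far more sophisticated, but the mechanism is the same: a model can only reflect the data it was given, so unbalanced or slanted data produces unbalanced or slanted output.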

5. You Can’t Play Real Games on a Phone

If you or your kids love games, you’ve probably downloaded your favorites to your electronic device. 

The sophistication of today’s software is well beyond the bulky hardware and minimal graphics of the first digital games. Still, a graphics-heavy game can lag and glitch on a phone.

Gamers often prefer to run their heavy graphics on an Xbox or PlayStation. That’s not strictly necessary, though, if you have an Android phone. Emulators such as PPSSPP let you play PlayStation Portable titles on the go, and tuning the emulator’s settings can keep gameplay smooth.

Instead of being stuck playing wherever there’s a wall outlet, gamers can play anywhere. With a good Wi-Fi connection for downloads and online features, they’ll still enjoy high-quality graphics and fast loading.

6. Robots are Out to Take Your Job

Well, this one is actually possible, depending on your role. Artificial intelligence helps businesses run more smoothly by cutting out unnecessary spending.

In some cases, the software and machines do a much faster job, making it more cost-effective to buy a robot than to pay for a human 40 hours a week.

You’ve been in stores where self-checkout computers have scaled back customer service personnel. There is usually someone monitoring the area in case help is needed, but where ten people were, now there’s one.

Beyond entry-level jobs, AI is also taking on work in healthcare and other advanced industries. Programs can analyze symptoms to help detect diseases, review financial accounts for signs of fraud, and much more.

This doesn’t mean you’ll become obsolete. Instead, your talents and skills can be used in a parallel role as employers focus on adjusting job descriptions and offering advanced training.

Conclusion

Although it’s come a long way, the IT industry is still surrounded by plenty of fiction. You may have heard a few of these myths before, but now you can debunk them yourself if they come up in conversation!
