5 Tech Myths: Cell Phones Don’t Cause Cancer & More


Myths are more common than most people will admit. They perpetuate because they sound like they could be true – and nobody has time to fact-check every last detail. Eventually, as the myths are repeated time and time again, they sound more factual than the truth.

Technology is as susceptible to myths as any other niche. The complexity of the subject, combined with the rapid introduction of new, unfamiliar innovations, creates a perfect breeding ground for misunderstanding. Let’s set these tech myths straight.

RAM Usage Is Bad


MakeUseOf occasionally receives a question from a reader asking how to reduce RAM usage on a computer, tablet or smartphone. The alarm is understandable. A user browsing the web in Windows 7 might open Task Manager to find over six gigabytes of RAM in use. “Ack!” they think, “no wonder my computer is so slow!”

In truth, this relationship should be flipped on its head. RAM is very, very quick. Mechanical hard drives and some forms of flash storage (like most SD cards) are slow. By storing data that might be needed in RAM, a computer can increase the load speed of frequently accessed software. If RAM is not full of data, it’s effectively doing nothing, so why have it sit empty?
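
To make the speed gap concrete, here’s a minimal timing sketch in Python – the file name payload.bin is just for illustration:

```python
import os
import time

# Write a 1 MB payload to disk so we can read it back.
payload = b"x" * (1024 * 1024)
with open("payload.bin", "wb") as f:
    f.write(payload)

# Time a read from disk.
start = time.perf_counter()
with open("payload.bin", "rb") as f:
    from_disk = f.read()
disk_time = time.perf_counter() - start

# Time a copy of data already sitting in RAM.
start = time.perf_counter()
from_ram = payload[:]
ram_time = time.perf_counter() - start

print(f"disk: {disk_time:.6f}s  ram: {ram_time:.6f}s")
os.remove("payload.bin")
```

If the disk read comes back nearly as fast as the RAM copy, that’s usually because the operating system already cached the file in RAM – which is exactly the caching behavior this myth complains about.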

Smartphone users shouldn’t worry for the same reason. Background apps can negatively impact performance on an Android phone, but this usually isn’t because of memory. Instead, the culprit is a misbehaving app hogging the processor. Clearing memory appears to improve performance only because it also closes the offending app.

Improperly Unmounting A USB Drive Will Delete Data


Windows has long sounded the alarm about improperly unmounting disk drives. To this day, you may still receive warning messages when you remove a drive that you haven’t properly ejected through the operating system. Given the alarm, you’d think the consequences of disobeying would be disastrous.

Not true. USB drives can be freely removed from a computer without issue in most situations. I can attest to this personally. As part of my work, I often have to move flash drives from one PC to the next, and I’ve never lost data from a drive because of it.

So why the warning? Microsoft is playing it safe. Data corruption can occur, but only if a USB drive is actively in use at the moment it is unplugged. Most users never do this. Still, Microsoft doesn’t want to be on the hook for the one-in-a-thousand time it does occur. And that’s why the alarm is raised even when there’s no fire.
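
The mechanism behind that rare corruption is write caching: the operating system may report a write as complete while the data is still sitting in a buffer, waiting to be sent to the drive. Here’s a minimal Python sketch of forcing data onto the drive before unplugging it – the mount point is hypothetical:

```python
import os

# Hypothetical path on a removable drive ("E:\\notes.txt" on Windows).
path = "/media/usb/notes.txt"

with open(path, "w") as f:
    f.write("important data")
    f.flush()             # push Python's internal buffer to the OS
    os.fsync(f.fileno())  # ask the OS to push its cache to the device

# Once fsync() returns, the bytes have reached the drive, so pulling
# the stick now won't lose this file. "Safely Remove Hardware" does
# the equivalent for every pending write on the volume.
```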

You Don’t Need An Antivirus If You’re Careful


Whenever I write an antivirus article, I inevitably receive a reply from some smart aleck who claims that you don’t need an antivirus if you’re careful. Viruses come from infected files, right? So just don’t download them! You’ll be fine.

Well, actually, that tech myth couldn’t be more wrong. A decade and a half ago most viruses were distributed through infected files, but malware has become far more sophisticated since then. Worms, a class of self-replicating malware, can infect any vulnerable computer through network exploits. Other viruses spread through browser vulnerabilities. And still more are designed to spread via USB drives or local networks.

Clever users might respond that people don’t have to worry if their software is up to date. This too is no guarantee. Zero-day exploits are common, and against them even a fully patched system is a sitting duck. An antivirus may be able to stop such an attack (even though the exploit is unknown) by using heuristic detection to raise the alarm when a file behaves suspiciously. Those without an antivirus, however, have no defense.

Cell Phones Cause Cancer


Many consumer technologies emit or use some form of radiation. Even radio waves are a form of radiation, and since cell phones use them, there has long been concern that holding a source of radiation against our heads could cause cancer. That concern seemed to be backed up by an alarming report from the World Health Organization, which classified cell phones as a Group 2B carcinogen – “possibly carcinogenic to humans”.

You’d expect that to be based on some fairly hefty evidence, right? Actually, the WHO report is less damning than the headlines suggest. Group 2B simply means that some evidence hints at a possible link, but that the evidence is too weak to be definitive. Meanwhile, numerous other studies have found no link at all, including a massive Danish study of 350,000 people released in late 2011.

Further evidence against the cancer risk can be found in what we know of physics. Radiation comes in many forms, and humans only need to worry about ionizing radiation – radiation energetic enough to damage DNA. Ultraviolet rays from the sun, which can cause skin cancer, are over 400,000 times more energetic than the radio waves emitted by cell phones. Low-energy waves like radio can’t damage DNA, and that means they can’t cause cancer.
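
The arithmetic is easy to check. A photon’s energy is proportional to its frequency, so the ratio of energies is just the ratio of frequencies. A back-of-the-envelope sketch, assuming representative values of about 7.5 × 10^14 Hz for near-ultraviolet light and about 1.9 × 10^9 Hz for a cell signal:

```latex
E = h\nu \qquad
\frac{E_{\mathrm{UV}}}{E_{\mathrm{cell}}}
  = \frac{\nu_{\mathrm{UV}}}{\nu_{\mathrm{cell}}}
  \approx \frac{7.5 \times 10^{14}\,\mathrm{Hz}}{1.9 \times 10^{9}\,\mathrm{Hz}}
  \approx 4 \times 10^{5}
```

That ratio is where the “400,000 times” figure comes from.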

Everything Electronic Causes Cancer


This means that what holds true for cell phones should hold true for other wireless devices as well. The rise of wireless networks has caused distress about what all those waves bouncing through the air might do to our cells. The answer is simple – nothing. Sleeping on a bed made of wireless routers would be uncomfortable, but it’s not going to cause cancer.

Some users become concerned because of another alarming effect. Heat. As electronics are used, they put out heat, and that heat is absorbed by our bodies. That’s why your thighs are warm after using a laptop.

Computers can be harmful if they’re too hot, but the problem isn’t limited to electronics. Dermatologists have long known that constant exposure to heat can cause scaly, discolored skin (a condition known as erythema ab igne) which is often permanent. A hot computer can cause this – as can a heating blanket, seat warmer, fireplace or oven.

While skin discoloration and minor burns can be a problem for a handful of people, there’s no evidence that normal, intermittent use of a computer will cause cancer. The lesson from dermatology is simple: if something is hot, don’t hang around it too long.

Conclusion

This is merely a handful of tech myths. There are plenty more out there, ranging from the believable to the utterly outrageous. Have you heard a tech myth that you later found out wasn’t true? Tell us about it in the comments.


By Matt Smith makeuseof.com


What Is A Processor Core?


Every computer has a processor, whether it’s a small efficiency chip or a large performance powerhouse; without one, the machine couldn’t function at all. The processor, also called the CPU or Central Processing Unit, is a critical part of a working system, but it isn’t the only one.

Today’s processors are almost all at least dual-core, meaning the chip contains two separate cores with which it can process information. But what are processor cores, and what exactly do they do?

What Are Cores?


A processor core is a processing unit which reads in instructions to perform specific actions. Instructions are chained together so that, when run in real time, they make up your computer experience. Literally everything you do on your computer has to be processed by your processor. Whenever you open a folder, that requires your processor. When you type into a word document, that also requires your processor. Things like drawing the desktop environment, the windows, and game graphics are the job of your graphics card — which contains hundreds of processors to quickly work on data simultaneously — but to some extent they still require your processor as well.

How They Work


Processor designs are extremely complex and vary widely between companies and even between models. Their architectures – currently “Ivy Bridge” for Intel and “Piledriver” for AMD – are constantly being refined to pack the most performance into the least space and power consumption. But despite all the architectural differences, every processor goes through four main steps whenever it processes an instruction: fetch, decode, execute, and writeback.

Fetch

The fetch step is exactly what you’d expect. Here, the processor core retrieves instructions that are waiting for it, usually from some sort of memory. That could include RAM, but in modern processor cores the instructions are usually already waiting inside the processor cache. The processor keeps a small register called the program counter, which essentially acts as a bookmark, letting the processor know where the last instruction ended and the next one begins.
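
As a rough illustration – a toy model in Python, not real silicon – the program counter can be pictured as an index into a list of waiting instructions:

```python
# Toy model: memory holds instructions; the program counter (pc)
# bookmarks where the next instruction begins.
memory = ["LOAD r1 5", "LOAD r2 7", "ADD r3 r1 r2", "HALT"]
pc = 0

def fetch():
    global pc
    instruction = memory[pc]  # retrieve the waiting instruction
    pc += 1                   # advance the bookmark
    return instruction

print(fetch())  # LOAD r1 5
print(fetch())  # LOAD r2 7
```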

Decode

Once the core has fetched an instruction, it moves on to decoding it. A single instruction may involve multiple areas of the core – the arithmetic units, for example – and the core needs to figure out which. Each instruction contains an opcode, which tells the core what should be done with the data that follows it. Once the core has worked this out, the relevant areas of the core can get to work.
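
Continuing the toy model, decoding can be pictured as splitting the fetched instruction into its opcode and operands, so the core knows which of its areas to put to work:

```python
# Toy model: the opcode is the first field, the operands follow it.
def decode(instruction):
    opcode, *operands = instruction.split()
    return opcode, operands

print(decode("ADD r3 r1 r2"))  # ('ADD', ['r3', 'r1', 'r2'])
print(decode("LOAD r1 5"))     # ('LOAD', ['r1', '5'])
```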

Execute


The execute step is where the processor knows what it needs to do, and actually goes ahead and does it. What exactly happens here varies greatly depending on which areas of the processor core are being used and what information is put in. As an example, the processor can do arithmetic inside the ALU, or Arithmetic Logic Unit. This unit can connect to different inputs and outputs to crunch numbers and get the desired result. The circuitry inside the ALU does all the magic, and it’s quite complex to explain, so I’ll leave that for your own research if you’re interested.
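
In the toy model, the ALU boils down to a function that takes an arithmetic opcode and two inputs and produces the result:

```python
# Toy ALU: map arithmetic and logic opcodes to their operations.
def alu(opcode, a, b):
    results = {
        "ADD": a + b,
        "SUB": a - b,
        "AND": a & b,
        "OR":  a | b,
    }
    return results[opcode]

print(alu("ADD", 5, 7))  # 12
print(alu("AND", 6, 3))  # 2
```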

Writeback

The final step, called writeback, simply places the result of the work back into memory. Where exactly the output goes depends on the needs of the running application, but it often stays in processor registers for quick access, since the following instructions frequently use it. From there it is kept on hand until it’s needed again, at which point it may be moved out into RAM.
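
Putting the four steps together, here’s the complete toy cycle – a sketch of the idea, not how real hardware is wired. Writeback is the final line, where the result lands in the register file:

```python
# Toy processor: fetch, decode, execute, writeback in one loop.
memory = ["LOAD r1 5", "LOAD r2 7", "ADD r3 r1 r2", "HALT"]
registers = {}  # the register file: fast storage for results
pc = 0

while True:
    # Fetch: grab the instruction at the program counter.
    instruction = memory[pc]
    pc += 1

    # Decode: split it into an opcode and operands.
    opcode, *ops = instruction.split()
    if opcode == "HALT":
        break

    # Execute: LOAD produces a literal; ADD uses the toy ALU idea.
    if opcode == "LOAD":
        dest, result = ops[0], int(ops[1])
    elif opcode == "ADD":
        dest, result = ops[0], registers[ops[1]] + registers[ops[2]]

    # Writeback: place the result into the destination register.
    registers[dest] = result

print(registers)  # {'r1': 5, 'r2': 7, 'r3': 12}
```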

It’s Just One Cycle

This entire process is called an instruction cycle. Instruction cycles happen ridiculously fast, especially with today’s high-frequency processors. A multi-core CPU runs this cycle on every core at once, so in the ideal case data can be crunched roughly as many times faster as the chip has cores, compared to a single core of similar performance. CPUs also have optimized instruction sets hardwired into their circuitry, which can speed up common instructions sent to them. A popular example is SSE.
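
SSE is a SIMD (“single instruction, multiple data”) extension: one instruction operates on several values at once. As a loose analogy only – using NumPy, whose vectorized operations lean on such instructions under the hood – compare adding values one at a time with adding them in a single vectorized step:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0, 4.0])
b = np.array([5.0, 6.0, 7.0, 8.0])

# Scalar style: one addition per step, like plain instructions.
scalar_sum = [float(x) + float(y) for x, y in zip(a, b)]

# Vectorized style: one operation over all elements - the SIMD idea.
simd_sum = a + b

print(scalar_sum)  # [6.0, 8.0, 10.0, 12.0]
print(simd_sum)    # [ 6.  8. 10. 12.]
```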

Conclusion


Don’t forget that this is a very simple description of what processors do – in reality they are far more complex and do a lot more than we realize. The current trend is for processor manufacturers to make their chips as efficient as possible, which includes shrinking the transistors. Ivy Bridge is built on a 22nm process, and there’s still some room to go before researchers hit a physical limit. Imagine all that processing occurring in such a small space. We’ll see how processors improve once we get there.

Where do you think processors will go next? When do you expect to see quantum processors, especially in the consumer market? Let us know in the comments below!

By Danny Stieben makeuseof.com