How Does A Hard Drive Work?


Hard disk dissection

The average laptop in the shops for around $500 has somewhere in the region of 60GB of storage. You see that figure and think ‘wow – imagine all the movies, songs, images, files and documents I could save on that baby’, right?

But did you ever think about how it actually gets stored?

If you were to stack the equivalent capacity of CDs in front of you, the pile would run to roughly 86 discs – getting on for a metre tall in jewel cases. You can fit everything on those CDs onto that hard drive. Truly amazing for an invention that has its origins in the 1950s and builds on the same magnetic-recording principles as the humble cassette tape.
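If you want to sanity-check that comparison, the back-of-envelope arithmetic is simple (the disc and case figures below are typical values, not exact ones):

```python
# Back-of-envelope check on the CD-stack comparison; figures are
# typical values, not exact ones.
DRIVE_GB = 60      # drive capacity from the article
CD_MB = 700        # capacity of a standard CD-R
CASE_MM = 10.4     # thickness of a standard jewel case

cds = DRIVE_GB * 1000 / CD_MB       # about 86 discs
stack_m = cds * CASE_MM / 1000      # about 0.9 m stacked in cases

print(f"{cds:.0f} CDs, a stack roughly {stack_m:.1f} m tall")
```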

 

How Does a Hard Drive Work – The Basics

[Image: the main parts of a hard drive]

In order to fully understand a hard drive you have to know how one works physically. Basically, there are discs stacked one on top of the other, spaced a few millimetres apart. These discs are called platters. Polished to a mirror shine and incredibly smooth, they can hold vast amounts of data.

Next we have the arm. This reads and writes data on the disc. It stretches out over the platter and moves across it from centre to edge, reading and writing data through tiny heads that hover just above the surface. On the average domestic drive the arm can oscillate around 50 times per second. On many high-spec machines and those used for complex calculations, this figure can rise into the thousands.

Hard drives use magnetism to store information, just like old cassette tapes. The heads contain tiny coils of copper wire: passing an electric current through a coil produces a magnetic field that can be switched and reversed very quickly, magnetising tiny spots on the platter.

Storage and Operation

[Image: a platter divided into tracks (highlighted yellow) and sectors (highlighted blue)]

When you save a file, the ‘write’ head on the arm writes the data onto the platter as it spins at high speed, typically 5,400 or 7,200 RPM. However, the data doesn’t just go anywhere, as the computer must be able to locate the file later. It also must not interfere with, or indeed delete, any other information already on the drive.

For this reason, platters are separated into different sectors and tracks. The tracks are the long circular divisions highlighted here in yellow. They are like ‘tracks’ on a vinyl record. Then we have the sectors, which are small sections of tracks; there are thousands of these from the centre to the edge of the platter. One is highlighted in blue in the picture.
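To get a feel for how those tracks and sectors become addresses, here’s a rough sketch of the classic cylinder/head/sector (CHS) scheme. The geometry figures are invented for illustration – modern drives hide all of this behind flat logical block addresses (LBA):

```python
# Sketch of the classic cylinder/head/sector (CHS) addressing scheme.
# The geometry below is invented for illustration; real drives expose
# a flat list of logical block addresses (LBA) instead.
HEADS = 4                # two platters, a read/write head on each side
SECTORS_PER_TRACK = 63   # sectors per track (numbered from 1)

def lba_to_chs(lba):
    """Map a logical block address to (cylinder, head, sector)."""
    cylinder = lba // (HEADS * SECTORS_PER_TRACK)
    head = (lba // SECTORS_PER_TRACK) % HEADS
    sector = lba % SECTORS_PER_TRACK + 1   # sector numbers start at 1
    return cylinder, head, sector

print(lba_to_chs(0))      # (0, 0, 1): first sector of the first track
print(lba_to_chs(5000))   # (19, 3, 24): further out on the platter
```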

In Operation

When you open a file, program or really anything on your PC, the hard drive must find it. So let’s say that you open an image. The CPU will tell the hard drive what you’re looking for. The platters will spin at high speed and the heads will find the image in a matter of milliseconds. The drive will then ‘read’ the image and send it to the CPU. The time it takes to do this is called the ‘read time’. Then the CPU takes over and sends the image on its way to your screen.
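Part of that read time is rotational latency: on average, the drive has to wait half a revolution for the right sector to swing under the head. You can estimate it from the spindle speed alone:

```python
# Average rotational latency: on average the drive must wait half a
# revolution for the requested sector to pass under the head.
def avg_latency_ms(rpm):
    seconds_per_revolution = 60 / rpm
    return seconds_per_revolution / 2 * 1000

for rpm in (5400, 7200, 15000):
    print(f"{rpm} RPM: {avg_latency_ms(rpm):.2f} ms")
# 5400 RPM: 5.56 ms, 7200 RPM: 4.17 ms, 15000 RPM: 2.00 ms
```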

Let’s say you edited the image. Well, now those changes must be saved. When you click ‘Save’, all of that information is shot to the CPU, which in turn processes it and sends it to the hard drive for storage. The hard drive will spin up and the arm will use its ‘write’ heads to overwrite the previous image with the new one. Job done.

That is what that buzzing disc in your computer gets up to all day. Now, as I do with most of my articles here on MUO, I shall leave you with a friendly word of advice:

If you want to look inside to further understand how a hard drive works, do so with an old one. There are a few reasons for this.

  • Once you pop open that drive, the tamper-evident seals over the screws will break, telling the manufacturer you have been poking around in there. Doing this voids your warranty immediately. Many drives actually have this warning printed on the side.
  • They’re expensive and carry a lot of important info, so don’t just pop open the family PC to have a go at it. Pick up an old one on eBay.

Tell us what YOU know about HDDs – share your thoughts 🙂

5 Tech Myths: Cell Phones Don’t Cause Cancer & More


Myths are more common than most people will admit. They persist because they sound like they could be true – and nobody has time to fact-check every last detail. Eventually, as the myths are repeated time and time again, they sound more factual than the truth.

Technology is as susceptible to myths as any other niche. The complexity of the subject, combined with the rapid introduction of new, unfamiliar innovations, creates a perfect breeding ground for misunderstanding. Let’s set these tech myths straight.

RAM Usage Is Bad


MakeUseOf will occasionally receive a question from a reader asking how to reduce RAM usage on a computer, tablet or smartphone. Their alarm is understandable. A user browsing the web in Windows 7 might open the task manager to find over six gigabytes of RAM in use. “Ack!” they think, “no wonder my computer is so slow!”

In truth, this relationship should be flipped on its head. RAM is very, very quick. Mechanical hard drives and some forms of flash storage (like most SD cards) are slow. By storing data that might be needed in RAM, a computer can increase the load speed of frequently accessed software. If RAM is not full of data, it’s effectively doing nothing, so why have it sit empty?
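To see why caching in RAM matters, here’s a toy sketch in Python. The 50-millisecond ‘disk’ delay and the file name are invented for illustration; the point is the gap between the first (uncached) and second (cached) read:

```python
import time
from functools import lru_cache

def read_from_disk(name):
    """Stand-in for a slow storage access (the delay is simulated)."""
    time.sleep(0.05)              # pretend this is a seek and read
    return f"contents of {name}"

@lru_cache(maxsize=128)           # keep recent results in RAM
def read_cached(name):
    return read_from_disk(name)

start = time.perf_counter()
read_cached("photo.jpg")          # first read: hits the slow 'disk'
mid = time.perf_counter()
read_cached("photo.jpg")          # second read: served from memory
end = time.perf_counter()

print(f"first read:  {mid - start:.3f} s")   # about 0.05 s
print(f"second read: {end - mid:.6f} s")     # mere microseconds
```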

Smartphone users shouldn’t worry either, for the same reason. Background apps can negatively impact performance on an Android phone, but the culprit is usually an app burning CPU cycles in the background, not a lack of free memory. Clearing memory appears to improve performance only because it closes the offending app.

Improperly Unmounting A USB Drive Will Delete Data


Windows has long sounded the alarm about improperly unmounting disk drives. To this day, you may still receive warning messages when you remove a drive that you haven’t properly ejected through the operating system. Given the alarm, you’d think that the consequences of disobeying would be disastrous.

Not true. USB drives can be freely removed from a computer without issue in most situations. I can attest to this personally. As part of my work, I often have to move flash drives from one PC to the next, and I’ve never lost data from a drive because of it.

So why the warning? Microsoft is playing it safe. Data corruption can occur, but only if a USB drive is actively in use at the moment it is unplugged. Most users don’t do this. Still, Microsoft doesn’t want to be on the hook for the one time in a thousand that it does occur. And that’s why the alarm is raised even when there’s no fire.
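For the curious, here’s roughly what ‘safely remove’ guarantees, sketched in Python. The file path is illustrative; the point is the flush-then-fsync pattern that forces buffered writes out to the device before you pull the plug:

```python
import os

# The OS buffers writes in RAM and pushes them to the device later.
# flush() empties Python's own buffer; os.fsync() asks the OS to push
# its write cache to the physical drive. Only after fsync returns is
# the data truly on the device. (The path below is illustrative.)
with open("/media/usb/notes.txt", "w") as f:
    f.write("important data")
    f.flush()
    os.fsync(f.fileno())
```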

You Don’t Need An Antivirus If You’re Careful


Whenever I write an antivirus article I inevitably receive a reply from some smart-alec who claims that you don’t need an antivirus if you’re careful. Viruses come from infected files, right? So just don’t download them! You’ll be fine.

Well, actually, that tech myth couldn’t be more wrong. A decade and a half ago, most viruses were distributed through infected files, but they’ve become far more sophisticated since then. Worms, a self-replicating class of malware, can infect any vulnerable computer through network exploits. Other viruses spread using browser vulnerabilities. And still more are designed to spread via USB drives or local networks.

Clever users might respond by claiming people don’t have to worry if their software is up to date. This too is no guarantee. Zero-day exploits do appear, and against them even a fully patched system is a sitting duck. An antivirus may be able to stop such an attack (even though it’s unknown) by using heuristic detection to raise the alarm when a file behaves suspiciously. Those without antivirus, however, have no defense.
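To give a flavour of what heuristic detection means – and this is a deliberately simplified sketch, with invented trait names and weights, not how any real engine works – imagine scoring a file by the suspicious behaviours it exhibits:

```python
# Deliberately simplified picture of heuristic detection: suspicious
# behaviours each add to a score, and a high total raises an alert.
# The traits and weights here are invented for illustration.
SUSPICIOUS_TRAITS = {
    "writes_to_system_dir": 40,
    "disables_security_tools": 50,
    "opens_many_connections": 20,
    "copies_itself_to_usb": 60,
}
ALERT_THRESHOLD = 70

def heuristic_score(observed):
    return sum(SUSPICIOUS_TRAITS.get(trait, 0) for trait in observed)

score = heuristic_score(["writes_to_system_dir", "copies_itself_to_usb"])
print(score, "ALERT" if score >= ALERT_THRESHOLD else "ok")   # 100 ALERT
```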

Cell Phones Cause Cancer


Many consumer technologies rely on energy and therefore emit or use some form of radiation. Even radio waves are a form of radiation, and since cell phones use them, there’s been concern that having a source of radiation close to our heads could cause cancer. This has been backed up by an alarming report from the World Health Organization, which placed cell phones in carcinogen Group 2B.

You’d expect that to be based on some fairly hefty evidence, right? Actually, the WHO report is less damning than it sounds in headlines. Group 2B simply means that a study has indicated there might be a link, but the link is too weak to be definitive. Meanwhile, numerous other studies have found no link. This includes a massive Danish study involving 350,000 people that was released in late 2011.

Further evidence against the risk of cancer can be found in what we know of physics. Radiation comes in multiple forms, and humans only need to worry about radiation energetic enough to damage DNA. Ultraviolet rays from the sun, which can cause skin cancer, are over 400,000 times more energetic than those emitted from cell phones. Low energy waves like radio can’t hurt DNA, and that means they can’t cause cancer.
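For the mathematically inclined, that comparison falls straight out of the photon-energy relation E = hf. With rough, order-of-magnitude frequencies (assumed here, not taken from the article) the ratio lands in the same ballpark:

```latex
% Photon energy is proportional to frequency: E = hf.
% Rough, order-of-magnitude frequencies (assumed values):
%   ultraviolet light:  f_UV   ~ 10^15 Hz
%   cell-phone signal:  f_cell ~ 2 x 10^9 Hz
\[
  \frac{E_{\mathrm{UV}}}{E_{\mathrm{cell}}}
  = \frac{h\,f_{\mathrm{UV}}}{h\,f_{\mathrm{cell}}}
  \approx \frac{10^{15}\,\mathrm{Hz}}{2 \times 10^{9}\,\mathrm{Hz}}
  = 5 \times 10^{5}
\]
```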

Everything Electronic Causes Cancer


This means that what holds true for cell phones should hold true for other wireless devices as well. The rise of wireless networks has caused distress about what all those waves bouncing through the atmosphere might do to our cells. The answer is simple – nothing. Sleeping on a bed made of wireless routers would be uncomfortable, but it’s not going to cause cancer.

Some users become concerned because of another, more tangible effect: heat. As electronics are used, they put out heat, and that heat is absorbed by our bodies. That’s why your thighs are warm after using a laptop.

Computers can be harmful if they’re too hot, but the problem isn’t limited to electronics. Dermatologists have long known that constant exposure to heat can cause scaly, discolored skin which is often permanent. A hot computer can cause this – as can a heating blanket, seat warmer, fireplace or oven.

While skin discoloration and minor burns can be a problem for a handful of people, there’s no evidence that normal, intermittent use of a computer will cause cancer. The lesson from dermatology is simple. If something is hot, don’t hang around it too long.

Conclusion

These are merely a handful of tech myths. There are plenty more out there, ranging from the believable to the utterly outrageous. Have you heard a tech myth that you later found out wasn’t true? Tell us about it in the comments.

 

By Matt Smith makeuseof.com

What Are The Differences Between Capacitive & Resistive Touchscreens?


It might not fully register, but we all know there are two types of touchscreens. There are those we find on expensive smartphones and tablets, which respond to the slightest touch, allow multi-touch and are generally highly responsive (unless you’re wearing gloves); and then there are those with a slightly longer response time, which require some pressure or a stylus and don’t have multi-touch abilities, but work no matter what you touch them with.

Whether you know what the difference is or not, you’ve probably experienced these differences yourself. When that happened, you might have wondered what causes them; why doesn’t your iPhone work when you’re wearing gloves? Why do touchscreens on feature phones behave differently from those of high-end smartphones? Why can’t you use just any old stylus on your iPad?

All these questions can be answered by two words: resistive and capacitive. The difference between these two touchscreen technologies answers all the above questions. Curious? Read on to find out exactly how it works. Note, however, that this is a simple explanation, and is not meant for engineers. Don’t expect to be able to build one of these by the end of the article!

Touchscreens In A Nutshell


Although touchscreens are becoming increasingly popular, they are by no means a new invention. The first touchscreen was invented back in the 1960s, and has gone through many changes and iterations to become the touchscreen we use today.

Touchscreens are not limited to smartphones and tablets; they are literally everywhere: from ATMs, point-of-sale terminals and navigation systems to game consoles and even touchpads on laptops. Touchscreens are popping up everywhere, and are slowly taking over our lives, so the least we can do is know a bit more about how they work!

Resistive Touchscreens

The resistive touchscreen is the most common type of touchscreen. Except for modern smartphones, tablets and trackpads, most touchscreens we come in contact with are actually resistive touchscreens. As you’ve probably guessed, the resistive touchscreen relies on resistance. In that respect, it’s pretty intuitive to understand – the pressure you apply causes the screen to respond.

A resistive touchscreen is made of two thin layers separated by a thin gap. These are not the only layers in the resistive touchscreen, but we’ll focus on them for simplicity. Both layers have a coating on one side, with the coated sides facing each other inside the gap, just like two pieces of bread in a sandwich. When these two coated layers touch each other, a circuit is completed, and the voltage at the point of contact is measured and processed as a touch in that location.


So when your finger, stylus, or any other instrument touches a resistive screen, it creates a slight pressure on the top layer, which is then transferred to the adjacent layer, thus starting the cascade of signals. Because of this, you can use anything you want on a resistive touchscreen to make the touch interface work; a gloved finger, a wooden rod, a fingernail – anything that creates enough pressure on the point of impact will activate the mechanism and the touch will be registered.
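Here’s a rough sketch of how a controller for a common 4-wire resistive screen might turn that contact into a coordinate. The drive and read functions are hypothetical stand-ins for real microcontroller pin and ADC calls:

```python
# Sketch of position sensing on a 4-wire resistive screen. drive() and
# read_adc() are hypothetical stand-ins for microcontroller pin and
# ADC calls; here read_adc() just returns a fake mid-screen value.
ADC_MAX = 1023   # 10-bit analogue-to-digital converter

def drive(pin, level):
    """Hypothetical GPIO call: set a pin high or low."""

def read_adc(pin):
    """Hypothetical ADC call: returns a fake reading for this demo."""
    return 512

def read_touch_x():
    # Put a voltage gradient across the X layer...
    drive("X+", "HIGH")
    drive("X-", "LOW")
    # ...then use the Y layer as a probe. At the contact point it picks
    # up the local voltage, which is proportional to the X coordinate.
    return read_adc("Y+") / ADC_MAX   # 0.0 = left edge, 1.0 = right edge

print(read_touch_x())   # about 0.5: a touch near the middle
# Reading Y works the same way with the roles of the layers swapped.
```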

For this very same reason, resistive touchscreens require slight pressure in order to register the touch, and are not always as quick to respond as capacitive touchscreens such as the iPhone’s. In addition, the resistive touchscreen’s multiple layers cause the display to be less sharp, with lower contrast than we might see on capacitive screens. While most resistive screens don’t allow for multi-touch gestures such as pinch to zoom, they can register a touch by one finger when another finger is already touching a different location on the screen.


Resistive screens have been improving greatly over the years, and today many lower-end smartphones boast a resistive screen which is no less accurate than high-end devices. Some recent devices using resistive touchscreens are the Nokia N800, the Nokia N97, the HTC Tattoo and the Samsung Jet. Another well-known device using resistive technology is the Nintendo DS, which was the first popular game console to make use of it.

Capacitive Touchscreens

Surprisingly, it was actually the capacitive touchscreen that was invented first; the first one was built almost 10 years before the first resistive touchscreen. Nevertheless, today’s capacitive touchscreens are highly accurate and respond instantly when lightly touched by a human finger. So how does it work?

As opposed to the resistive touchscreen, which relies on the mechanical pressure made by the finger or stylus, the capacitive touchscreen makes use of the electrical properties of the human body. A capacitive screen is usually made of one insulating layer, such as glass, which is coated by a transparent conductive material on the inside. Since the human body is conductive, which means electricity can pass through it, the capacitive screen can use this conductivity as input. When you touch a capacitive touchscreen with your finger, you cause a change in the screen’s electrical field.


This change is registered, and the location of the touch is determined by a processor. This can be done by several different technologies, but they all rely on the electrical change caused by the light touch of a finger. This is the reason you cannot use a capacitive screen while wearing gloves – the gloves are not conductive, and the touch does not cause any change in the electrostatic field. The same goes for non-capacitive styluses.
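One of those technologies, self-capacitance sensing, times how long the electrode takes to charge through a resistor; a fingertip adds capacitance and stretches that time. A simplified simulation, with assumed component values:

```python
import math

# Self-capacitance sensing, simplified: charge the electrode through a
# resistor and time how long it takes to reach a threshold voltage.
# A fingertip adds capacitance, so a touch stretches the charge time.
# All component values below are assumed for illustration.
R = 1_000_000        # 1 Mohm charge resistor
C_PAD = 10e-12       # ~10 pF electrode capacitance at rest
C_FINGER = 5e-12     # extra capacitance added by a fingertip
THRESHOLD = 0.63     # threshold as a fraction of the supply voltage

def charge_time(c):
    """Time for an RC circuit to charge to THRESHOLD of the supply."""
    return -R * c * math.log(1 - THRESHOLD)

no_touch = charge_time(C_PAD)
touch = charge_time(C_PAD + C_FINGER)
print(f"no touch: {no_touch * 1e6:.1f} us, touch: {touch * 1e6:.1f} us")
# The controller flags a touch when the time exceeds the baseline.
```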


Since capacitive screens are made of one main layer, which is constantly getting thinner as technology advances, not only are these screens more sensitive and accurate, but the display itself can be much sharper, as seen on devices such as the iPhone 4S. And of course, capacitive touchscreens can also make use of multi-touch gestures, provided the controller can track several fingers at the same time; a simpler single-touch capacitive screen with one finger already on it won’t be able to sense another touch accurately.

Which type of screen do you prefer? Do you like being able to use your touchscreen with any type of stylus or instrument, or do you value speed and accuracy over anything else? Share your opinions in the comments.

By Yaara Lancet makeuseof.com

What Is The Difference Between An LCD And An LED Backlit LCD Display?


This subject is complex because it’s simple. The difference between an LED and an LCD TV is subtle, which can make it difficult to understand. It’s an important distinction, however, because it can significantly impact image quality as well as price. I’m also going to explain the differences between LED displays – not all of them are built the same.

The Core Question – What’s LCD vs LED?


LCD, or Liquid Crystal Display, is the fundamental display technology used by most monitors, televisions, tablets and smartphones. It consists of a panel of liquid crystal molecules that can be induced by electrical fields to take certain patterns which block light or allow it through.

Color LCD displays have red, green and blue sub-pixels in each pixel. The intensity of light allowed through each sub-pixel is carefully controlled to create a detailed picture capable of displaying millions of different colors.
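That ‘millions’ figure is easy to verify: with the usual 8 bits per sub-pixel, each colour channel has 256 levels, and the combinations multiply:

```python
# Where "millions of colors" comes from: 8 bits per sub-pixel gives
# 256 intensity levels each for red, green and blue.
levels = 2 ** 8            # 256 levels per sub-pixel
colors = levels ** 3       # one level choice per colour channel
print(f"{colors:,}")       # 16,777,216
```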

However, the crystals create no light of their own. It’s possible to light an LCD using reflected ambient light (the Nintendo Game Boy Advance operated in this way) but all LCD HDTVs have a backlight which shines light through the display.

In the past, HDTVs used cold cathode fluorescent lamps (CCFLs) to provide this light. However, manufacturers realised that light-emitting diodes (LEDs) could provide equal light with less energy. It was also possible to turn individual diodes off when no light was needed, something that’s not possible with CCFL lighting.

Because the addition of LEDs for backlighting was the new feature, this was used to describe the new televisions. But the new LED TVs still use an LCD display, just like the previous models lit by CCFL tubes.

So, Why Is LED Better?


CCFL tubes can’t be switched on or off while a display is turned on and can only be arranged in vertical or horizontal lines. This creates picture quality problems. Since the lighting is never turned off, dark scenes are hard to render properly, and the arrangement of the CCFL tubes can cause parts of a display to appear brighter than others.

LEDs, on the other hand, can be quickly switched on or off. This allows much better control of light. They also can be arranged in a grid across a display or in a ring around a display, which offers theoretically better light distribution. Finally, LEDs do not consume as much energy.

There are different types of LED displays, however, and each has different traits. I’ll explain each.

Edge-Lit LED

[Image: a green LED edge-lighting a Christmas tree pattern in a light guide]

The image above provides a basic example of how an edge-lit display works. In this instance a green LED shines inwards on a Christmas tree pattern. The light is guided along that pattern and creates a glowing profile. In an edge-lit HDTV there are also light guides, but instead of trying to create a specific pattern they attempt to distribute light evenly across the interior of the television.

This technology can be used to create extremely thin displays, is generally low on power draw and relatively inexpensive compared to other LED variants. If you see an LED-backlit HDTV for a low price there’s a good chance it is edge-lit.

Edge-lit displays usually do not manage to be entirely even in their light distribution, so they suffer from uniformity issues (i.e. parts of the image appear brighter than others). Some models offer local dimming, a feature that precisely controls the light output of the LEDs to display deeper black levels.
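The idea behind local dimming is straightforward, even if real implementations are far more refined. Here’s a toy sketch: split the frame into zones and drive each zone’s LED only as bright as its brightest pixel demands:

```python
# Toy local dimming: split a frame into backlight zones and drive each
# zone's LED only as bright as its brightest pixel. The 4x8 "frame" of
# pixel luminances (0.0 = black, 1.0 = white) is invented for the demo.
frame = [
    [0.0, 0.0, 0.1, 0.0, 0.9, 1.0, 0.8, 0.9],
    [0.0, 0.1, 0.0, 0.0, 1.0, 0.9, 0.9, 1.0],
    [0.0, 0.0, 0.0, 0.1, 0.8, 1.0, 1.0, 0.9],
    [0.1, 0.0, 0.0, 0.0, 0.9, 0.9, 1.0, 0.8],
]
ZONE_WIDTH = 4   # each zone spans a 4-pixel-wide vertical band

def backlight_levels(pixels):
    """One LED level per zone: the peak luminance inside that zone."""
    width = len(pixels[0])
    return [
        max(row[c] for row in pixels for c in range(start, start + ZONE_WIDTH))
        for start in range(0, width, ZONE_WIDTH)
    ]

print(backlight_levels(frame))   # [0.1, 1.0]: dim left zone, bright right
```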

Full Array LED


A full array LED display has a grid of LED lights behind the LCD display. They shine directly outwards, creating a bright and usually uniform picture. Most televisions with a full array are expensive, enthusiast models that offer local dimming. This can provide excellent black level performance.

There are, however, a few LED sets with a full array that lack local dimming. A television set up this way will provide the uniformity benefits of LED, but probably won’t offer black levels that are much, if any, deeper than a good display lit by traditional CCFLs.

RGB-LED

This rare technology uses colored LED lights to provide additional color and lighting control. This creates very precise colors and can also provide better detail in scenes with a lot of contrast. RGB-LED is technically a modifier of the other two types – there can be edge-lit and full array versions – but most displays with this type of backlight are full array.

There are not a lot of displays that use this technology. Dell is known to offer RGB-LED in its workstation laptops and there are some high-end televisions and monitors that offer this, such as Sony’s $5000 Bravia XBR8.

Displays with RGB-LED are almost always very, very good, but most people can’t justify the extra cost.

Conclusion

I hope this has clarified the difference between LCD and LED – or, rather, highlighted the fact that it’s not a difference so much as a confusion of terms.

If you’re wondering if LEDs are worth it over non-LED displays, the answer is that it depends. There are other competing technologies, like Plasma and OLED, which operate differently and have different traits.

Individual product quality is also a big deal. Some of the best displays in the world use LED backlighting – but there are also some very poor displays that use this technology, as well.

By Matt Smith makeuseof.com