My journey from small-town hick to Mac user

A Mac user confesses that it was not the computer that he fell in love with.

(credit: Kris Connor/Getty Images)

I grew up in a low-tech household. My dad only replaced something if it caught fire. We owned about 15 cars (mostly Humber 80s), and 13 of them were used to keep the other two running. Same story for tractors and any other farm equipment you care to name. Dad’s basic rule was that if he couldn't repair it, we didn't need it. We weren't anti-technology, but technology had to serve a purpose. It had to work reliably or at least be fun to repair.

Then I decided I wanted a computer. Much saving ensued, and after a while I was the proud owner of a secondhand Commodore VIC-20, with an expanded memory system, advanced BASIC, and a wonky tape drive... and no TV to plug it into. After begging an old black-and-white television from family friends, I was set for my computing adventures. But they didn't turn out as planned.

Yes, I loved the games, and I tried programming. I even enjoyed attempting to make games involving weird lumpy things colliding with other weird lumpy things. But I never really understood how to program. I could do simple things, but I didn't have the dedication or background to go further. There was no one around to guide me into programming, and, even worse, I couldn't imagine doing anything useful with my VIC-20. After a couple of years, the VIC-20 got packed away and forgotten.

Photonic crystal club will no longer admit only puny lasers

Making delicate structures in a plasma with a laser sledgehammer.

Crystal growth, imaged using a CARS microscope. (credit: Martin Jurna, Optical Sciences Group, University of Twente)

Research is like any other human endeavor, as subject to trends and fads as the fashion industry. Everyone wants to jump on the latest new thing. In the world of optics, that means photonics. I'll explain photonics in a second, because it's cool and everyone should be able to talk knowledgeably about it to their older relatives.

Photonics involves carefully structuring materials to bend light to the experimenter's will. But photons don't always cooperate. They're a bit like ants—while one photon doesn't do much, several photons carry off all your breadcrumbs and threaten the honey, and the entire photon colony will repossess your fridge, contents included. In other words, photonics labs are filled with the burnt remains of experiments because careless researchers cranked up the laser power.
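
For a feel of what "structuring materials" means, take the simplest photonic structure, a periodic stack of transparent layers: light is strongly reflected when it satisfies the Bragg condition (a textbook relation, not something from this article):

    m \lambda = 2 n d \cos\theta

Here d is the layer period, n the refractive index, \theta the angle from the normal, and m an integer. A full photonic crystal plays the same trick in two or three dimensions.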

This is kind of sad, because photonic crystals are incredibly useful, and the world of high power lasers is missing out on all the cool tricks that have been developed by the photonics community. Until now, that is.

Review: Airmail, an e-mail client that I don’t hate

It integrates everything beautifully and lets you focus your attention.

E-mail clients are a personal thing. Something I love is not necessarily something you love, and I'm okay with that. Maybe I'm just picky, but I am an equal opportunity hater when it comes to e-mail clients, and I've tried quite a few. A couple of years ago I stumbled upon Airmail and never looked back. Bloop released Airmail 3.0 just last month.

Back in the day I used Eudora, but I've also used Thunderbird, Outlook, and Apple's own Mail client. In Mac OS 9 days, I even tried out a client called Nisus Email. Many more have briefly messed up my inbox and then disappeared. I don't mind paying for an e-mail client, but it has to do its job the way I want it to. So what do I want it to do?

The first thing is that it should not be integrated with its own calendar or to-do list. This sounds strange, but I often find myself reading an e-mail that has a list of dates and times for a possible event. In an integrated client, I have to flip back and forth between tabs to really check those dates. Sure, you can usually get a mini-calendar to one side, but then you have to click through day-by-day. If the applications are separate, I can put them side by side and get a much clearer view. This works especially well when I have more than a single screen available.

Scientific publishers are killing research papers

Pressure to publish short articles removes details, leaves readers confused.

If I were to summarize the ideal scientific paper in four sentences, it would look like this:

  • Look at this cool thing we did.
  • This is how we did the cool thing.
  • This is the cool thing.
  • Wasn't that cool?

We like to think that the standard format (not to be confused with the Standard Model) was beautifully followed in days of yore. Nowadays, of course, it is not. Because things always get worse, right? In reality, scientific papers have always looked more like this:

  • Look at this cool thing we did, IT IS REALLY COOL, BE INTERESTED.
  • This is how we did the cool thing (apart from this bit that we "forgot" to mention, the other thing that we didn't think was important, and that bit that a company contributed and wants to keep a secret. Have fun replicating the results!).
  • This is the cool thing.
  • This thing we did is not only cool, but is totally going to cure cancer, even if we never mentioned cancer and, in fact, are studying the ecology of the lesser spotted physicist.

Call me cynical, but missing information in the methods section, as described in the parenthetical in item two, really, really bugs me. I think it bugs me more now than it did ten years ago, even though I'm no longer the student in the lab who's stuck with filling in the missing methods himself.

Going digital may make analog quantum computer scalable

Digital quantum network cleans up analog noise, allows quantum computation.

Making qubits is easy. Controlling how they communicate, however... (credit: NSF)

There are many different schemes for making quantum computers work (most of them evil). But they pretty much all fall into two categories. In most labs, researchers work on what could be called a digital quantum computer, which has the quantum equivalent of logic gates and whose qubits are based on well-defined, well-understood quantum states. The other camp works on analog devices called adiabatic quantum computers. In these devices, qubits do not perform discrete operations, but continuously evolve from some easily understood initial state to a final state that provides the answer to some problem. In general, the analog and digital camps don't really mix. Until now, that is.

The adiabatic computer is simpler than a digital quantum computer in many ways, and it is easier to scale. But an adiabatic computer can only be generalized to any type of problem if every qubit is connected to every other qubit. This kind of connectivity is usually impractical, so most people build quantum annealers with reduced connectivity. These are not universal and cannot, even in principle, compute solutions to all problems that might be thrown at them.
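
To see why full connectivity becomes impractical, count the pairwise couplings (back-of-the-envelope arithmetic on my part, not a figure from the researchers):

    \binom{N}{2} = \frac{N(N-1)}{2}, \qquad N = 1000 \;\Rightarrow\; 499{,}500 \text{ couplers}

The wiring grows quadratically while the number of qubits grows only linearly.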

The issues with adiabatic quantum computers don't end there. Adiabatic quantum computers are inherently analog devices: each qubit is driven by how strongly it is coupled to every other qubit. Computation is performed by continuously adjusting these couplings between some starting and final value. Tiny errors in the coupling—due to environmental effects, for instance—tend to build up and throw off the final value.
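
In textbook form (the article doesn't write it out), an adiabatic computation sweeps the machine's Hamiltonian from an easy starting form to one that encodes the problem:

    H(s) = (1 - s) H_{\text{start}} + s H_{\text{problem}}, \qquad s: 0 \to 1

Because s changes continuously, any drift in the couplings perturbs H(s) at every instant, and nothing ever snaps the state back to a clean 0 or 1 the way a digital logic gate would.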

Human eye might be able to detect entangled photons

Will the quantum entanglement abyss stare back at you?

One of the less satisfying aspects of modern physics is the increasing separation between the phenomena that we measure and the experimenter. Today, we measure almost everything indirectly. If we operate our lab safely, we never directly detect an electron—instead, its charge creates a tiny potential difference on an amplifier. The amplifier generates a larger current that might drive a coil that is attached to a needle on a dial.

This level of indirection is the reality of modern physics. And the alternative—passing large currents through your body—is discouraged. Yet, the desire to really see what is going on is hard to resist. This has led to an interesting publication that proposes a way to detect quantum mechanical behavior directly with the human eye.

Seeing single photons

The behavior in question is entanglement. But before getting to that, let's talk about the eye. The human visual system is a pretty poor instrument as far as optics goes. As a detector, though, the eye is actually pretty good: experiments have revealed that the rods in your eye are sensitive to single photons. The brain, however, is smart; rather than try to sort out all the noise associated with every single-photon detection, it tells the rods and cones not to bother it until the light reaches a certain intensity.
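
To get a feel for why that strategy works, here is a minimal Python sketch (my illustration, with made-up numbers, not anything from the paper). With Poisson photon statistics, one spurious hit is common, but seven near-simultaneous hits from noise are vanishingly rare:

    from math import exp, factorial

    def poisson_at_least(k: int, mean: float) -> float:
        """Probability of k or more Poisson events with the given mean."""
        return 1.0 - sum(mean**i * exp(-mean) / factorial(i) for i in range(k))

    flash = 20.0  # hypothetical mean photons absorbed from a dim flash
    noise = 0.5   # hypothetical mean spurious events in the same window

    for threshold in (1, 7):
        print(f"threshold={threshold}: "
              f"P(see flash)={poisson_at_least(threshold, flash):.3f}, "
              f"P(false alarm)={poisson_at_least(threshold, noise):.1e}")

Raising the threshold from one photon to seven barely dents the chance of seeing a real flash but cuts the false-alarm rate by more than five orders of magnitude.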

Gravitational waves may reveal stringy Universe

Pattern of gravitational waves may reveal string theory’s remnant strings.

Everyone has been pretty excited by the recent observation of gravitational waves. I know that I am prone to exaggeration, but gravitational waves really do open up a new way to observe the Universe.

At the moment, when we observe the night sky, the farther into the distance we look, the farther back in time we see. But this relationship is based on an assumption: the light we see has not bounced off anything in between us and its origin. Normally, this is a pretty safe assumption, because space is pretty big, and most of the material in it (dust and the like) doesn't do much.
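
The arithmetic behind that statement is just light travel time (my gloss, in the naive non-expanding picture):

    t_{\text{lookback}} = d / c

so light arriving from a galaxy a billion light-years away left it a billion years ago. An expanding Universe complicates the bookkeeping, but the farther-equals-earlier rule survives.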

But in the very early Universe, before atoms had formed, things were very dense, so light scattered a lot. The scattering means that the information a photon carried about its origin was lost. As a result, we can't really see much beyond the time when all the charged particles agreed to stick together and create the first three elements of the periodic table.

How IBM’s new five-qubit universal quantum computer works

IBM achieves an important milestone with new quantum computer in the cloud.

The five qubits in IBM's quantum computer. (credit: IBM)

In the wee hours of Wednesday morning, IBM gave an unwary world its first publicly accessible quantum computer. You might be worried that you can tear up your passwords and throw away your encryption, for all is now lost. However, it's probably a bit early to call time on the world as we know it. You see, the whole computer is just five qubits.

This might sound like some kind of publicity stunt: maybe it's IBM's way of clawing some attention back from D-Wave's quantum computing efforts. But a careful look shows that there is some serious science underlying the announcement.

The IBM system is, on a very superficial level, similar to D-Wave's. They both use superconducting quantum interference devices as qubits (quantum bits). But the similarity ends there. As IBM emphasizes, its quantum computer is a universal quantum computer, something that D-Wave's is not.
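
To make "the quantum equivalent of logic gates" concrete, here is a toy two-qubit simulation in Python (entirely my sketch; IBM's machine is actually programmed through a web interface, and none of these names come from IBM). A Hadamard gate followed by a CNOT entangles two qubits, exactly the kind of primitive a universal, gate-based machine offers and an annealer does not:

    import numpy as np

    # Single-qubit Hadamard and identity gates as matrices.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    I = np.eye(2)

    # CNOT in the |q0 q1> basis: flips q1 when q0 is 1.
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>
    state = np.kron(H, I) @ state                  # Hadamard on qubit 0
    state = CNOT @ state                           # entangle the pair

    # Result: the Bell state (|00> + |11>)/sqrt(2).
    print(np.abs(state) ** 2)  # -> [0.5 0. 0. 0.5]

Measuring either qubit now instantly determines the other, an entangled correlation built from a short sequence of discrete gates.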

The search for hidden dimensions comes up empty again

Swinging pendulum obeys inverse square law, fails to fall into hidden dimension.

A Foucault pendulum. (credit: Flickr user luciasantamaria)

We have a beautiful theory that puts all of nature's forces into a single, neat package. The whole of it can be summed up in a single line of very compact—and for most people, including me, incomprehensible—mathematics. At least, that is what we would like to be able to say, but this beauty is marred. Imagine the Mona Lisa with an eyepatch drawn on in crayon.

That is modern physics. The eyepatch is gravity.

There are many ideas about how to remove the crayon eyepatch from the masterpiece of modern physics and create a single, unified theory, but there's little evidence to support any of them. Among the ideas are theories involving extra dimensions (like string theory). And for nearly 10 years, physicists have been fruitlessly searching for evidence for these hidden dimensions.
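
The logic behind those pendulum experiments (standard extra-dimensions reasoning, summarized by me): if gravity could leak into n extra dimensions curled up at some small size R, the force law would steepen at short range,

    F \propto \frac{1}{r^{2+n}} \quad (r \ll R), \qquad F \propto \frac{1}{r^{2}} \quad (r \gg R)

so a measured deviation from the inverse square law at small separations would betray the hidden dimensions. So far, the inverse square law stubbornly holds.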

Using the uncertainty principle against itself to gain precision

Researchers show how classical and quantum measurements beat quantum limits.

You can't beat the Heisenberg limit, but with enough math, you can come close. (credit: Focus Features)

Accurate measurement underlies a huge amount of modern technology. Atomic clocks, fiber-optic communications systems, and many other types of hardware require accurate and precise measurements. The laws of quantum mechanics, on the other hand, are designed to annoy anyone obsessed with precision. In some cases, it's impossible to increase precision—not because the laws of physics prohibit knowledge but because the probe with which we measure is limited by quantum mechanics.

This limit is often referred to as the standard quantum limit. However, you can, with a great deal of pain, prepare special probes that beat the standard quantum limit. In this case, a different limit applies, called the Heisenberg limit.
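
For reference, the two limits scale differently with the number of photons N used in a phase measurement (standard definitions, not spelled out in the article):

    \Delta\phi_{\text{SQL}} \sim \frac{1}{\sqrt{N}}, \qquad \Delta\phi_{\text{Heisenberg}} \sim \frac{1}{N}

A Heisenberg-limited measurement with 100 photons thus matches the precision of a standard-quantum-limited one using 10,000.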

You can't beat the Heisenberg limit. So the big question is "can we find a method that reduces the amount of pain required to approach it?" The answer, it seems, is yes.
