Computers

Cool news from Intel, so I wonder if Apple is working on an iThink. Richard Gray writes in the Telegraph: Unlike current brain-controlled computers, which require users to imagine making physical movements…


It’s probably not entirely coincidental that William Gibson chose to pen this op-ed for the New York Times the week before his new book Zero History is released, but nonetheless you have to pay attention when the author of Neuromancer shares his thoughts on the future landscape of computing and artificial intelligence:

Vancouver, British Columbia

“I actually think most people don’t want Google to answer their questions,” said the search giant’s chief executive, Eric Schmidt, in a recent and controversial interview. “They want Google to tell them what they should be doing next.” Do we really desire Google to tell us what we should be doing next? I believe that we do, though with some rather complicated qualifiers.

Science fiction never imagined Google, but it certainly imagined computers that would advise us what to do. HAL 9000, in “2001: A Space Odyssey,” will forever come to mind, his advice, we assume, eminently reliable — before his malfunction. But HAL was a discrete entity, a genie in a bottle, something we imagined owning or being assigned. Google is a distributed entity, a two-way membrane, a game-changing tool on the order of the equally handy flint hand ax, with which we chop our way through the very densest thickets of information. Google is all of those things, and a very large and powerful corporation to boot…






Alan Mascarenhas writes on Newsweek:

It takes a lot to snap people out of apathy about Africa’s problems. But in the wake of Live Aid and Save Darfur, a new cause stands on the cusp of going mainstream. It’s the push to make major electronics companies (manufacturers of cell phones, laptops, portable music players, and cameras) disclose whether they use “conflict minerals” — the rare metals that finance civil wars and militia atrocities, most notably in Congo.

The issue of ethical sourcing has long galvanized human-rights groups. In Liberia, Angola, and Sierra Leone, the notorious trade in “blood diamonds” helped fund rebel insurgencies. In Guinea, bauxite sustains a repressive military junta. And fair-labor groups have spent decades documenting the foreign sweatshops that sometimes supply American clothing stores. Yet Congo raises especially disturbing issues for famous tech brand names that fancy themselves responsible corporate citizens.

A key mover behind the Congo campaign is the anti-genocide Enough Project: witness its clever spoof of the famous Apple commercial.


Rory Cellan-Jones reports on BBC News:

A British scientist says he is the first man in the world to become infected with a computer virus. Dr Mark Gasson from the University of Reading contaminated a computer chip which was then inserted into his hand.

The device, which enables him to pass through security doors and activate his mobile phone, is a sophisticated version of ID chips used to tag pets.

In trials, Dr Gasson showed that the chip was able to pass on the computer virus to external control systems. If other implanted chips had then connected to the system they too would have been corrupted, he said.



Armen Keteyian reports on the CBS Evening News:

At a warehouse in New Jersey, 6,000 used copy machines sit ready to be sold. CBS News chief investigative correspondent Armen Keteyian reports almost every one of them holds a secret.

Nearly every digital copier built since 2002 contains a hard drive — like the one on your personal computer — storing an image of every document copied, scanned, or emailed by the machine.
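The risk here is the familiar one from PC forensics: “deleting” a file only removes its directory entry, and the underlying bytes stay recoverable until they are overwritten. A minimal sketch of in-place overwriting in Python — illustrative only (the function name is mine, real tools like shred(1) do considerably more, and flash media with wear-leveling can defeat this approach entirely):

```python
import os

def shred_file(path, passes=2):
    """Overwrite a file's bytes in place before unlinking it,
    so its contents can't be trivially recovered from the disk.
    (Simplified sketch; not a substitute for a real disk-wiping tool.)"""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # replace contents with random bytes
            f.flush()
            os.fsync(f.fileno())       # force the write to physical storage
    os.remove(path)
```

For a copier being resold, the equivalent step would be wiping or removing the drive itself before it leaves the building.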


“We have mimicked how neurons behave in the brain,” announces an international research team from Japan and Michigan Tech. They’ve built an “evolutionary circuit” in a molecular computer that evolves to solve complex problems, and the molecular computer also exhibits brain-like massively parallel processing!

“The neat part is, approximately 300 molecules talk with each other at a time during information processing…



This is really insane; it really does make it seem like these traders are just playing with Monopoly money. If only actual working people’s pensions and savings were not tied to this grand casino, we could laugh it off. See the line in bold print below; it’s priceless. Tim Paradis writes on the AP via HuffPo:
Stock Market: The Ride

A computerized selloff possibly caused by a simple typographical error triggered one of the most turbulent days in Wall Street history Thursday and sent the Dow Jones industrials to a loss of almost 1,000 points, nearly a tenth of their value, in less than half an hour. It was the biggest drop ever during a trading day.

No one was sure what happened, other than that automated orders were activated by erroneous trades. One possibility being investigated was that a trader accidentally placed an order to sell $16 billion, instead of $16 million, worth of futures, and that was enough to trigger sell orders across the market.

The Dow recovered two-thirds of the loss before the closing bell, but that was still the biggest point loss since February of last year. The lightning-fast plummet temporarily knocked normally stable stocks such as Procter & Gamble to a tiny fraction of their former value and sent chills down investors’ spines.

“Today … caused me to fall out of my chair at one point. It felt like we lost control,” said Jack Ablin, chief investment officer at Harris Private Bank in Chicago.
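The “$16 billion instead of $16 million” scenario is a classic fat-finger error, and a pre-trade sanity check that flags orders wildly out of scale with a trader’s normal activity is one common safeguard. A minimal sketch in Python — the function name and threshold are illustrative, not any exchange’s actual rules:

```python
def check_order(notional_usd, typical_notional_usd, max_ratio=100):
    """Reject an order whose size is wildly out of line with the
    trader's typical activity -- a simple fat-finger safeguard."""
    if notional_usd > typical_notional_usd * max_ratio:
        raise ValueError(
            f"Order of ${notional_usd:,.0f} exceeds {max_ratio}x the "
            f"typical ${typical_notional_usd:,.0f} -- confirm manually."
        )
    return True

# A $16 million futures order passes; $16 billion trips the check.
check_order(16_000_000, 16_000_000)
# check_order(16_000_000_000, 16_000_000)  # raises ValueError
```

Checks like this existed in 2010, of course; the point of the flash crash was how quickly automated selling cascaded once one bad order got through.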




“Why not develop music in ways unknown…? If beauty is present, it is present.”

That’s Emily Howell talking – a highly creative computer program written in LISP by U.C. Santa Cruz professor David Cope. (While Cope insists he’s a music professor first, “he manages to leverage his knowledge of computer science into some highly sophisticated AI programming.”)

Classical musicians refuse to perform Emily’s compositions, and Cope says they believe “the creation of music is innately human, and somehow this computer program was a threat…



Kopin makes advanced night vision goggles and thermal weapon sights for the U.S. Army, but they’ll soon be releasing a wearable Windows/smartphone eye display!

Imagine your smartphone feeding information to a virtual 15-inch Microsoft Windows PC display that sits in front of one eye, just beneath your line of sight. You speak commands using hands-free speech recognition to control both wireless access to the internet and your phone…


From Cory Doctorow at BoingBoing:

Scott sez,

A few weeks ago, Frontline premiered a documentary called “Digital Nation”. In one segment, the vice-principal of Intermediate School 339, Bronx, NY, Dan Ackerman, demonstrates how he “remotely monitors” the students’ laptops for “inappropriate use”. (His demonstration begins at 4:36.)


He says “They don’t even realize we are watching,” “I always like to mess with them and take a picture,” and “9 times out of 10, THEY DUCK OUT OF THE WAY.”

He says the students “use it like it’s a mirror” and he watches. He says 6th and 7th graders have their cameras activated. It looks like the same software used by the Pennsylvania school that is being investigated for covertly spying on students through their webcams.

The shocking thing about this is that the privacy concerns were not even mentioned in the Frontline documentary!


21 AI experts have predicted dates for four artificial intelligence milestones. Seven predict AIs will achieve Nobel prize-winning performance within 20 years, while five predict that will be accompanied by superhuman…


A warning courtesy of PC World: The successor program to the notorious Zango spyware toolbar is being used to target users of Mozilla’s Firefox with fake browser updates, a security company has…


Peter Diamandis just held a workshop at MIT discussing a $10 million “X-Prize” for building a brain-computer interface!

Besides the ability to communicate by thought, the article argues, a Brain-Computer Interface X Prize “will reward nothing less than a team that provides vision to the blind, new bodies to disabled people, and perhaps even a geographical ‘sixth sense’ akin to a GPS iPhone app in the brain.” And one software engineer argues the technology could become commercially available within the next 10 years.



This YouTube video is becoming a worldwide sensation with news headlines like “HP Cams Can’t See Black Faces.” The official blurb from the YouTube page:

The face tracking feature of the HP web cam will not recognize or track black faces. You have to watch this video. It is hilarious!


Citing competing teams on both sides of the Atlantic, this article describes “the race to develop cognitive computing by reverse engineering the brain.”

While IBM is using the world’s fourth-fastest supercomputer, the same supercomputer is also being used by the Blue Brain project at the Ecole Polytechnique Federale de Lausanne. (The head of the project notes the difficulty in “recreating the three-dimensional structure of the brain in a 2-D piece of silicon… It’s not a brain. It’s more of a computer processor that has some of the accelerated parallel computing that the brain has.”)

Meanwhile IBM still hopes “to noninvasively measure and map the connections between all cortical and sub-cortical locations within the human brain using magnetic resonance diffusion weighted imaging.”

With rapidly accelerating advances in supercomputer architectures, can a simulated human brain be far off?