Tag Archives | Technology

Appearing on Conan, Louis C.K. brilliantly deconstructs why he hates the technology surrounding us and why he won’t allow his kids to have smartphones – they become tools for avoiding sadness and loneliness, and thus a true understanding of the self:
Balkinization on the techno-utopian cult of disruption:
Why is the term “disruption” so popular nowadays? Elite media features a parade of thinkers keen on “disrupting” old institutions. Talk of social contracts is passé in an America obsessed with technocapitalist visions of a prosperous future.
The yen for “disruption,” an empty term for empty minds in empty people, makes traditional obstacles like social contracts suspect or downright pernicious. This has led to an embrace of proceduralism by those true believers who want an app economy to be the engine of capitalism. And such people rule the world.
The view of society as an institution-free network of autonomous individuals practicing free exchange makes the social sciences, with the exception of economics, irrelevant. What’s left is engineering, neuroscience, an understanding of incentives (in the narrowly utilitarian sense): just right for those whose intellectual predispositions are to algorithms, design, and data structures.
Unfortunately, the “disruptions” pursued by Silicon Valley giants (and their well-heeled consultants) often have little to do with challenging the biggest power centers in society.
Are you excited about the new iPhone’s nifty biometric fingerprint unlock system? The police might be, too.
Courts have given mixed messages about whether Americans are protected from being forced to divulge passwords or decrypt information for law enforcement officials. Civil liberties advocates argue defendants shouldn’t have to unlock their own computers for the cops. The logic: under the Fifth Amendment, police can’t force you to incriminate yourself by testifying or divulging something in your mind.
It’s unclear if that same protection applies if the password is your fingerprint.
“A fingerprint is entitled to less constitutional protection than a password known in your mind,” said Hanni Fakhoury, a staff attorney at the Electronic Frontier Foundation in San Francisco. “If police arrest you and ask you for a password, you could refuse and they’d be hard pressed to force you to divulge the password.”
Of course, police already collect fingerprints after booking a suspect.
Via Forbes, Kashmir Hill reveals that the “demonic house” horror archetype may soon be coming true:
“I can see all of the devices in your home and I think I can control them,” I said to Thomas Hatley, a stranger in Oregon who I had rudely awoken with an early morning phone call.
He and his wife were still in bed. Expressing surprise, he asked me to try to turn the master bedroom lights on and off. Sitting in my living room in San Francisco, I flipped the light switch with a click, and resisted the Poltergeist-like temptation to turn the television on as well.
Googling a very simple phrase led me to a list of “smart homes” that had done something rather stupid. The homes all have an automation system from Insteon that allows remote control of their lights, hot tubs, fans, televisions, water pumps, garage doors, cameras, and other devices, so that their owners can turn these things on and off with a smartphone app or via the Web.
Will contemporary society’s ever-growing, never-ending stream of information gradually paralyze and destroy us all? In 2008 the IT consulting firm Basex offered the following conservative estimate of its cost, a figure that has presumably only risen since:
According to our latest research Information Overload costs the U.S. economy a minimum of $900 billion per year in lowered employee productivity and reduced innovation. This is a fairly conservative number and reflects the loss of 25% of the knowledge worker’s day to the problem. The total could be as high as $1 trillion.
Information overload describes an excess of information that results in the loss of ability to make decisions, process information, and prioritize tasks. It is nothing new – it was very much on the minds of thought leaders centuries ago, including Roger Bacon, Samuel Johnson, and Konrad Geßner whose 1545 Bibliotheca universalis warned of the “confusing and harmful abundance of books” and promulgated strategies for coping with the overload of information.
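Basex’s $900 billion headline number can be sanity-checked with a back-of-the-envelope calculation. The worker count and average cost below are illustrative assumptions (Basex did not publish its inputs here), chosen only to show how losing 25% of the workday reaches that order of magnitude:

```python
# Illustrative back-of-the-envelope only: the workforce size and the
# per-worker cost are assumptions, not Basex's actual inputs. The point
# is that a 25% productivity loss plausibly lands near $900 billion.
knowledge_workers = 50_000_000   # assumed U.S. knowledge workforce
avg_annual_cost   = 72_000       # assumed fully loaded cost per worker ($)
lost_fraction     = 0.25         # Basex: 25% of the workday lost

annual_loss = knowledge_workers * avg_annual_cost * lost_fraction
print(f"${annual_loss / 1e9:.0f} billion")  # -> $900 billion
```

Change either assumed input by a third and the total swings by hundreds of billions, which is why Basex bounded its own estimate between $900 billion and $1 trillion.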
Could you survive these brutal camps? Via the International Business Times:
In response to rising numbers of young people who are “pathologically addicted” to the internet, Japan is opening up so-called “internet fasting camps” to wean youths off the web. Researchers at Japan’s Nihon University estimate that about 8.1 percent of the country’s students are addicted to the internet.
The Tokyo government’s education ministry will introduce “web fasting camps” to help young people disconnect from their PCs, laptops, mobile phones and hand-held devices. Akifumi Sekine, a spokesman for the education ministry, added: “We want to get them out of the virtual world and to encourage them to have real communication with other children and adults.”
Youths forcibly separated from their beloved mobile devices may well suffer withdrawal symptoms from going “cold turkey.”
Via Edge.org, Bruce Sterling tells us what to not worry about:
Twenty years have passed since Vernor Vinge wrote his remarkably interesting essay about “the Singularity.”
This aging sci-fi notion has lost its conceptual teeth. Its chief evangelist, visionary Ray Kurzweil, just got a straight engineering job with Google. Despite its weird fondness for AR goggles and self-driving cars, Google is not going to finance any eschatological cataclysm in which superhuman intelligence abruptly ends the human era. Google is a firmly commercial enterprise.
We’re no closer to “self-aware” machines than we were in the 1960s. A modern wireless Cloud is an entirely different cyber-paradigm from the imaginary 1990s “minds on nonbiological substrates” that might allegedly have the “computational power of a human brain.” A Singularity has no business model, and no major power group in our society is interested in provoking one.
[Instead] we’re getting what Vinge predicted would happen without a Singularity, which is “a glut of technical riches never properly absorbed.”
We spend much of our time watching screens; soon, screens will be watching us too. Businessweek writes:
Online training technology company Mindflash on Tuesday announced a new feature called FocusAssist for iPad that uses the tablet’s camera to track a user’s eye movements. When it senses that you’ve been looking away for more than a few seconds (because you were sending e-mails, or just fell asleep), it pauses the course, forcing you to pay attention—or at least look like you are—in order to complete it.
Sound kind of creepy, even Big Brother-y? Mindflash doesn’t think so. Donna Wells, the company’s CEO, writes in an e-mail: “Our focus is making sure trainees get all the information they need to do their jobs well.”
The feature was developed by a group of Stanford University Ph.D.s who also founded Sension, a “computer vision technology” company in Palo Alto that wants to use emotion- and facial-recognition technology to treat autism and Alzheimer’s disease.
Via Wired, Jathan Sadowski writes that “tech hacks” to shield our own privacy shouldn’t be the answer we are looking for:
It’s the notion of tech-centric solutionism: what tech hack, device, or app can I turn to for a quick fix to my privacy troubles? There’s no shortage of articles and how-to guides for securing privacy, with headlines promising “Five ways to stop the NSA from spying on you.”
Here’s the thing, though: We shouldn’t resign ourselves to a life where cyber-hygiene and an obsession with technological solutions fools us into thinking we’ve somehow preserved our privacy.
It’s always going to be a losing battle when going against a panoptic titan whose methods are wide-reaching, constantly evolving, and classified. Just look at the fates of Lavabit and Silent Circle, two encrypted email services that shut down last week.
The fundamental belief in technology’s ability to “fix” everything ignores the fact that not everything needs to be fixed in the first place.
A preview of how our society eventually crumbles – subtle sabotage by algorithms in everyday machines? The FontFeed on a mind-boggling discovery:
Last Wednesday, German computer scientist David Kriesel made a bizarre discovery. After scanning a construction plan on a Xerox Workcentre and printing it, he noticed the plan suddenly contained incorrect numbers. The Xerox Workcentre had somehow changed the numbers whilst scanning.
On his website Kriesel analyses what causes the problem in Xerox Workcentre 7535 and 7556 machines – a compression algorithm randomly replaces patches of pixel data in an almost unnoticeable way.
Apparently Xerox machines use JBIG2, an algorithm that creates a dictionary of image patches it considers similar. As long as the error generated by these patches is not too high, the machine reuses them instead of using the original image data.
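To make the failure mode concrete, here is a minimal sketch (in Python, not Xerox’s actual implementation) of JBIG2-style pattern matching and substitution: patches within a pixel-error threshold are collapsed to a single dictionary entry, so two glyphs that differ by only a pixel or so – say, a smudged “6” and an “8” at low scan resolution – can decode as the very same symbol:

```python
# Toy sketch of JBIG2-style pattern matching and substitution (not
# Xerox's actual code): similar-looking patches are replaced by one
# dictionary representative, which can silently swap symbols.

def hamming(a, b):
    """Count differing pixels between two equal-size binary patches."""
    return sum(x != y for x, y in zip(a, b))

def compress(patches, threshold):
    """Replace each patch with the first dictionary entry within
    `threshold` differing pixels; otherwise add it as a new entry."""
    dictionary = []
    out = []
    for p in patches:
        match = next((d for d in dictionary if hamming(d, p) <= threshold), None)
        if match is None:
            dictionary.append(p)
            match = p
        out.append(match)
    return out, dictionary

# Two 3x3 "glyphs" flattened to 9 pixels, differing in a single pixel --
# stand-ins for a "6" and an "8" scanned at low resolution.
six   = (1,1,1, 1,0,0, 1,1,1)
eight = (1,1,1, 1,0,1, 1,1,1)

out, dictionary = compress([six, eight], threshold=2)
print(out[0] == out[1])  # True: the "8" silently decodes as the "6"
```

With `threshold=0` the substitution is lossless and both glyphs survive as separate dictionary entries; the corruption only appears once the error tolerance is loose enough to merge visually distinct symbols – which is exactly the trade-off the Workcentre’s compression settings got wrong.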
Why is this issue so crucial? First of all, these are widespread machines, commonly used in service centres and copy shops, and Xerox seemed to be unaware of the issue until David Kriesel notified them last Wednesday.