Computers write news at Thomson

This story from Aline van Duyn in New York...

First it was the typewriter, then the teleprinter. Now a US news service has found a way to replace human beings in the newsroom and is instead using computers to write some of its stories.

Thomson Financial, the business information group, has been using computers to generate some stories since March and is so pleased with the results that it plans to expand the practice.

The computers work so fast that an earnings story can be released within 0.3 seconds of the company making results public.

By using previous results in Thomson’s database, the computer stories say whether a company has done better or worse than expected.
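The mechanics are simple enough to sketch. The Python snippet below is a hypothetical illustration, not Thomson's actual system; the company name, figures and wording of the template are all invented. It shows how a templated earnings story can be generated by comparing a reported figure against the expectation held in a database:

```python
def earnings_story(company, actual_eps, expected_eps):
    """Generate a one-line earnings story by comparing the reported
    earnings-per-share figure with the expected figure drawn from a
    database of previous results."""
    diff = actual_eps - expected_eps
    if diff > 0:
        verb = "beat"
    elif diff < 0:
        verb = "missed"
    else:
        verb = "matched"
    return (f"{company} reported earnings of ${actual_eps:.2f} per share, "
            f"which {verb} the expected ${expected_eps:.2f}.")

print(earnings_story("Acme Corp", 1.25, 1.10))
# Acme Corp reported earnings of $1.25 per share, which beat the expected $1.10.
```

Because everything is filled from a fixed template, such a story can be assembled and released in a fraction of a second, which is also why the output reads as "very standardised".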

“This is not about cost but about delivering information to our customers at a speed at which they can make an almost immediate trading decision,” said Matthew Burkley, senior vice-president of strategy at Thomson Financial. “This means we can free up reporters so they have more time to think.”

Mr Burkley said the computer-generated stories had not made any mistakes. But he said they were very standardised.

“We might try and write a few more adjectives into the program,” he said.

Follow me on Twitter: @IanYorston

Cities, Cyborgs and Contemporary Art

The Institute of Contemporary Arts is running a talk on Future Cities entitled The Cyborg and the City.

So the cyborg future is pretty mainstream now...

Cyberspace is dead, say the new gurus of mobile communications, and long live the cyborg. The network of wireless connections between ourselves, other people and our surroundings is said to be transforming how we navigate our way around the city.

Almost invisibly, we have all turned into ‘cyborgs’ or ‘electronomads’ — human bodies with embedded digital extensions — and have the city as our network. But is wireless IT as transformative as its enthusiasts claim? And what are the implications of the mobile revolution for activists, urban architects and designers?

Speakers: William J Mitchell, Professor of Architecture and Media Arts at MIT and author of Me++ and Placing Words; Kevin Warwick, Professor of Cybernetics at Reading University and pioneer of the surgical implantation of mobile devices under human skin.

Fri 04 Nov 2005 at 19:00 in the Nash Room at the Institute of Contemporary Arts in London. Full Price: £8, Concession: £7, ICA Members: £6.

Link: Institute of Contemporary Arts.

How our human interface may evolve...

Eyetap: The EyeTap Personal Imaging Lab (ePI Lab) is a computer vision and intelligent image processing research lab focused on personal imaging, mediated reality and wearable computers. Founded in 1998 by Professor Steve Mann, it was formerly known as the Humanistic Intelligence Laboratory (HI Lab) at the University of Toronto. Its mandate from the beginning has been to research wearable computing and cybernetic concepts and to channel developments into practical, market-driven products and processes.

Steve Mann has also written Cyborg: Digital Destiny and Human Possibility in the Age of the Wearable Computer

Many of these ideas tie in with the MyLifeBits Project from Microsoft's BARC Media Presence Group, which is a lifetime store of everything. It is the fulfillment of Vannevar Bush's 1945 Memex vision, including full-text search, text and audio annotations, and hyperlinks. There are two parts to MyLifeBits: an experiment in lifetime storage, and a software research effort.

The experiment: Gordon Bell has captured a lifetime's worth of articles, books, cards, CDs, letters, memos, papers, photos, pictures, presentations, home movies, videotaped lectures, and voice recordings and stored them digitally. He is now paperless, and is beginning to capture phone calls, IM transcripts, television, and radio.

The software research: Jim Gemmell and Roger Lueder have developed the MyLifeBits software, which leverages SQL Server to support hyperlinks, annotations, reports, saved queries, pivoting, clustering, and fast search. MyLifeBits is designed to make annotation easy, including gang annotation on right click, voice annotation, and web browser integration. It includes tools to record web pages, IM transcripts, radio and television. The MyLifeBits screensaver supports annotation and rating. The team is beginning to explore features such as document similarity ranking and faceted classification, and has collaborated with the WWMX team to get a mapped UI, and with the SenseCam team to digest and display SenseCam output.

Japanese develop 'female' android

Japanese scientists have unveiled the most human-looking robot yet devised - a "female" android called Repliee Q1.

She has flexible silicone for skin rather than hard plastic, and a number of sensors and motors to allow her to turn and react in a human-like manner. She can flutter her eyelids and move her hands like a human. She even appears to breathe.

Professor Hiroshi Ishiguro of Osaka University says one day robots could fool us into believing they are human.

Repliee Q1 is not like any robot you will have seen before, at least outside of science-fiction movies.

She is designed to look human and although she can only sit at present, she has 31 actuators in her upper body, powered by a nearby air compressor, programmed to allow her to move like a human.

Link: BBC Technology.

Don't Hate Me Because I'm Digital

Kaya is ravishing.

She has full lips, long lashes, and a slightly upturned nose. Her expression radiates confidence and power, and her smooth skin is well scrubbed and dotted with freckles. But she doesn't have much of a body. At all. In fact, she exists only from the neck up. Kaya is a CG model, a 48,200-polygon beauty created by an artist in São Paulo, Brazil. And she's sure to be a finalist in the Miss Digital World beauty pageant.

The man behind the event is Franz Cerami, an Italian promoter who's trying to start the world's first CG talent agency. His dream is to manage a bevy of virtual beauties, posing and costuming them for pinup calendars, videogames, ads, and movies. The benefits of digital models are obvious - they never age, never have bad hair days, and can be on location in Tokyo, Paris, and Hollywood simultaneously.

But there are downsides. For example, all of the talent that Cerami first auditioned looked creepy and waxen. CG artists call this the "uncanny valley," the point at which a near-human model looks so real that every flaw and shortcoming is thrown into high relief. (Ever seen a child terrified of mannequins? Blame the uncanny valley.)

Cerami doesn't give up easily. (When contacted for this story, he kept inquiring if Vogue, one of Wired's sister publications, needed any CG talent for its next cover.) He's betting that if he casts his net wider, he can find virtual women who leap across the uncanny valley. Last January, Cerami announced his digital beauty pageant, and now the finest 3-D artists in the world are sending him glamour shots of hi-res divas that make Lara Croft look like Ms. Pac-Man.

Cerami thinks of the models as people - artists are asked to provide personality profiles, even birth dates, for their creations. To avoid scandals à la Vanessa Williams, models who have appeared in pornographic works, "even as extras or cameos," are disqualified, according to the agency's guidelines.

Link: Wired 12.11: Don't Hate Me Because I'm Digital.

'Living' robots powered by muscle

An important step in bio-robotics...

Tiny robots powered by living muscle have been created by scientists at the University of California, Los Angeles.

The devices were formed by "growing" rat cells on microscopic silicon chips, the researchers report in the journal Nature Materials.

Less than a millimetre long, the minuscule robots can move themselves without any external source of power.

The work is a dramatic example of the marriage of biotechnology with the tiny world of nanotechnology.

In nanotechnology, researchers often turn to the natural world for inspiration.

But Professor Carlo Montemagno, of the University of California, Los Angeles, turns to nature not for ideas, but for actual starting materials.

In the past he has made rotary nano-motors out of genetically engineered proteins. Now he has grown muscle tissue onto tiny robotic skeletons.

Living device

Montemagno's team used rat heart cells to create a tiny device that moves on its own when the cells contract. A second device looks like a minute pair of frog legs.

"The bones that we're using are either a plastic or they're silicon based," he said. "So we make these really fine structures that mechanically have hinges that allow them to move and bend.

"And then by nano-scale manipulation of the surface chemistry, the muscle cells get the cues to say, 'Oh! I want to attach at this point and not to attach at another point'. And so the cells assemble, then they undergo a change, so that they actually form a muscle.

"Now you have a device that has a skeleton and muscles on it to allow it to move."

Under a microscope, you can see the tiny, two-footed "bio-bots" crawl around.

Professor Montemagno says muscles like these could be used in a host of microscopic devices - even to drive miniature electrical generators to power computer chips.

But when biological cells become attached to silicon - are they alive?

"They're absolutely alive," Professor Montemagno told BBC News. "I mean the cells actually grow, multiply and assemble - they form the structure themselves. So the device is alive."

The notion is likely to disturb many who already have concerns about nanotechnology.

But for Carlo Montemagno, a professor of engineering, it makes sense to match the solutions that nature has already found through billions of years of evolution to the newest challenges in technology.

Link: BBC NEWS | Science/Nature | 'Living' robots powered by muscle.

They're Robots? Those Beasts!

The New York Times (registration required)

Dr. Ayers is one of a handful of robotics researchers who regard animals as their muses. Their field is often referred to as biomimetics, and the researchers who are developing robotic lobsters, flies, dogs, fish, snakes, geckos and cockroaches believe that machines inspired by biology will be able to operate in places where today's generation of robots can't go.

"Animals have adapted to any niche where we'd ever want to operate a robot," Dr. Ayers said. His RoboLobster, for instance, is being designed to hunt for mines that float in shallow waters or are buried beneath beaches, a harsh environment where live lobsters have no trouble maintaining sure footing.

Another researcher, Howie Choset of Carnegie Mellon University, has been testing sinuous segmented robots based on snakes and elephant trunks that may be the perfect machines to search for survivors inside the rubble of structures destroyed by explosions or natural disasters.

But replicating biology isn't a breeze, and some think that despite the well-publicized introduction of Sony's toy dog, Aibo, in 1999, useful biomimetic robots may still be many years off.

Further information

Carnegie Mellon University's snake robot

RHex, a six-legged robot inspired by the cockroach

Mecho-Gecko, developed by iRobot Corporation and the University of California, Berkeley

Northeastern University's robot lobster (click on any link under Online Animations of Biomimetic Systems)

Robotic fly project at the University of California, Berkeley

M.I.T.'s robotic fish, RoboPike and RoboTuna

Robotic Farmers

College of Engineering - University of Illinois at Urbana-Champaign

Farm equipment in the future might very well resemble the robot R2D2 of Star Wars fame. But instead of careening through a galaxy far, far away, these agricultural robots might be wobbling down a corn row, scouting for insects, blasting weeds, and taking soil tests.

University of Illinois agricultural engineers have developed several agricultural robots, one of which actually resembles R2D2, except that it’s square instead of round. The robots are completely autonomous, directing themselves down corn rows, turning at the end and then moving down the next row, said Tony Grift, University of Illinois agricultural engineer.

The long-term goal, he said, is for these small, inexpensive robots to take on some of the duties now performed by large, expensive farm equipment. As Grift asked, “Who needs 500 horsepower to go through the field when you might as well put a few robots out there that communicate with each other like an army of ants, working the entire field and collecting data?”

He said it's all part of the “smaller and smarter” approach.

And speaking of ants, one of the robots coming out of agricultural engineering is a foot-long “Ag Ant,” which is being designed to walk through crop rows on mechanical legs. Built for only $150, these cheap robots could someday be used to form a robotic strike force.

“We’re thinking about building 10 or more of these robots and making an ecosystem out of them,” Grift said. “If you look at bees, they will go out and find nectar somewhere. Then a bee will go back and share this with the group and the whole group will collect the food. Similarly, one robot might find weed plants. Then it would communicate this location to the other robots and they would attack the plants together as a group--an ecosystem, if you will.”

In addition to the “Ag Ant,” Grift and Yoshi Nagasaka, a visiting scholar from Japan, developed a more expensive, high-tech robot for about $7,000. This robot guides itself down crop rows using a laser mounted in front to gauge the distance to corn plants.

Meanwhile, Grift and Matthias Kasten, an intern from Germany, have built yet another robot, this one for roughly $500. The robot is equipped with two ultrasonic sensors that bounce sound waves off objects, as well as four of the cheap infrared sensors used in simple motion detectors.

These low-budget robots maneuver down crop rows using what Grift calls “the drunken sailor” approach. The robot drifts to the left, senses a corn plant, then steers off to the right, senses another plant and steers back to the left. As a result, the robot weaves its way between the rows. To make turns at the end of a row, sensors detect when crop rows end and then signal the robot to turn.
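The "drunken sailor" control loop is simple enough to sketch. The snippet below is a hypothetical one-dimensional simulation, not Grift's actual code; the distance units, drift step and sensor thresholds are invented. The robot drifts in its current steering direction until it senses a plant row at either side, then reverses direction, weaving its way between the rows:

```python
def drunken_sailor_step(position, heading, row_left=0.0, row_right=1.0, drift=0.05):
    """One control step of the 'drunken sailor' approach: drift in the
    current steering direction; when a plant row is sensed at either
    side, reverse direction. Position is in hypothetical units across
    the gap between two crop rows."""
    position += heading * drift
    if position <= row_left:                  # left sensor fires: plant detected
        position, heading = row_left, 1       # steer back to the right
    elif position >= row_right:               # right sensor fires: plant detected
        position, heading = row_right, -1     # steer back to the left
    return position, heading

# Simulate the robot weaving down a row from the centre of the gap
pos, head = 0.5, 1
track = []
for _ in range(60):
    pos, head = drunken_sailor_step(pos, head)
    track.append(pos)

# The robot oscillates between the two rows without leaving the gap
assert all(0.0 <= p <= 1.0 for p in track)
assert 0.0 in track and 1.0 in track
```

The appeal of this scheme is that it needs no map and no positioning system: two side-facing proximity sensors and a bang-bang steering rule are enough to keep a cheap robot between the rows.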

World's smallest robot flies forward

Seiko Epson has designed the insect-sized craft as a more advanced successor to its flying micro-robot, reports Japan Today.

The new version of the world's smallest robot flies autonomously according to a flight-route program sent by Bluetooth wireless from a computer.

The robot has two tiny ultrasonic motors that drive two propellers in opposite directions for lift.

Epson said the model, which is 136mm wide, 85mm tall and weighs 8.6 grams without the battery, will be on display at the Tokyo International Forum from August 27 to 30.

Robots to save Hubble telescope?

BBC Science

The US space agency has given the go-ahead for a robotic mission to repair the Hubble Space Telescope, Nasa officials have announced.

Nasa chief Sean O'Keefe has asked for a firm mission proposal to be worked up in a year, after which a decision whether to proceed will be made.

"Everybody says: 'We want to save the Hubble'. Well, let's go save the Hubble," Mr O'Keefe said.

Nasa ceased manned missions to service Hubble after the Columbia disaster. Mr O'Keefe instructed engineers at the Goddard Space Flight Center in Maryland to begin serious work to put the robotic mission into space in 2007.

Some reports suggest a leading candidate for the mission is a robot called Dextre, developed by the Canadian Space Agency. The two-armed robot, whose name is short for "dexterous", was developed for work on the International Space Station.
