Google's plans for a $2.7 billion initial public offering (IPO) have generated frenzied stock market attention. But they have also led some experts to consider the company's long-term ambitions to develop technologies that go far beyond web searching.
"My overall sense is that they see a broader future than most people are talking about," says Dan Kusnetzky at US analyst firm IDC.
The IPO plans, announced on Thursday, have provided potential investors with their first glimpse of Google's financial records and the company's inner structure. Google rapidly grew to prominence after launching in 1998 because the "PageRank" algorithms behind its search service proved to be more efficient than existing services for locating information on the web.
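PageRank's core idea is that a page inherits importance from the pages that link to it. A minimal power-iteration sketch, where the toy link graph and damping factor are illustrative and the algorithm is simplified from the one Brin and Page published:

```python
# Minimal PageRank by power iteration (illustrative sketch, not
# Google's production implementation).
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / len(pages) for p in pages}
        for p, outgoing in links.items():
            # Each page shares its rank equally among its outlinks.
            share = rank[p] / len(outgoing) if outgoing else 0.0
            for q in outgoing:
                new[q] += damping * share
        rank = new
    return rank

# A toy web of four pages: "b", "c" and "d" all link to "a",
# so "a" ends up with the highest rank.
web = {"a": ["b"], "b": ["a"], "c": ["a"], "d": ["a"]}
ranks = pagerank(web)
```

Because every page here has at least one outlink, the ranks remain a probability distribution summing to one.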
Since its debut, however, Google has moved in new directions, by developing new technology for itself or buying other innovative companies. For example, Google has refined its search algorithms to let users search for product images or news stories. Google also offers online diaries or "weblogs" as well as social networking through a site called Orkut.
The latest development came on 1 April when Google revealed plans for a free email service called Gmail. Google said it would provide each user with one gigabyte of storage space, vastly more than is provided by competitors, such as MSN Hotmail and Yahoo. Gmail will also let users search through all their email quickly and will provide adverts based on the content of a message - something that has raised privacy concerns. The company has revealed that 95 per cent of its revenue now comes from advertising.
This announcement prompted some technically-minded observers to suggest that Google's long-term strategy may be to provide other functionality normally associated with desktop computing. Some have speculated that Google may be working towards providing web users with an account on a massive distributed platform that would handle tasks more usually carried out by a desktop operating system like Microsoft's Windows.
"More and more we expect to see applications that are hosted on the network and accessed thought a constellation of devices," Kusnetzky says. "And my sense is that Gmail was the first attempt to come up with a service that will be accessible from a range of devices as well as the desktop."
The IPO plans have revealed a number of intriguing facts. One is that co-founders Sergey Brin and Larry Page do not in fact own the patent for PageRank. This belongs to Stanford University, although Brin and Page have exclusive rights to it until 2011. Another is that Google says it is aiming to raise $2,718,281,828 from its share sale, a figure whose digits match those of the mathematical constant e, the base of natural logarithms.
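The target is no coincidence of rounding: multiplying e by a billion and rounding to the nearest dollar reproduces the figure exactly. A quick check:

```python
import math

# e = 2.718281828459045...; the IPO target is the first ten
# digits of e read as a dollar amount.
target = round(math.e * 1_000_000_000)
print(target)  # 2718281828
```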
Imagine a future in which databases are populated with accurate, valid, exhaustive, rapidly updated data where users find what they want all the time; where drug discovery costs and development time are slashed and animal experimentation is reduced through early identification of unpromising paths; where new insights are gained through integration and exploitation of experimental results, databases, and scientific knowledge; where product development archives and patents yield new directions for R&D; and where searching yields facts rather than documents to read.
This is the potential of text mining.
JISC, the BBSRC and the EPSRC have announced funding of £1m to establish a National Centre for Text Mining. The remit of the Centre, the first publicly funded centre of its kind in the world, is to contribute to the associated national and international research agenda, to establish a service for the wider academic community, and to make connections with industry.
Text mining attempts to discover new, previously unknown information by applying techniques from natural language processing, data mining, and information retrieval:
> To identify and gather relevant textual sources
> To analyse these to extract facts involving key entities and their properties
> To combine the extracted facts to form new facts or to gain valuable insights.
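The three steps above can be illustrated with a toy extract-then-combine example; the pattern and sentences below are invented for demonstration and are far simpler than real natural language processing:

```python
import re

# Toy fact extraction: pull "X binds Y" pairs out of free text,
# then combine the extracted facts to hypothesise an indirect,
# previously unstated link (step 3 above).
sentences = [
    "Protein A binds protein B in yeast cells.",
    "Protein B binds protein C under heat stress.",
]

pattern = re.compile(r"[Pp]rotein (\w+) binds protein (\w+)")
facts = [m.groups() for s in sentences for m in pattern.finditer(s)]
# facts -> [('A', 'B'), ('B', 'C')]

# Chain the facts: if A-B and B-C are known, suggest A-C.
inferred = {(a, c) for a, b1 in facts for b2, c in facts
            if b1 == b2 and a != c}
```

Real systems replace the regular expression with trained entity recognisers and parsers, but the extract-and-chain shape of the pipeline is the same.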
Text mining finds applications in many diverse areas of wide interest such as drug discovery and predictive toxicology, protein interaction, competitive intelligence, protection of the citizen, identification of new product possibilities, detection of links between lifestyle and states of health, and many more.
Led by UMIST, the National Centre for Text Mining will be run by an internationally leading consortium. The consortium has four UK partner institutions: UMIST, the Victoria University of Manchester, the University of Liverpool, and the University of Salford. These core partners are joined by international partners: the University of California, Berkeley, the University of Geneva, the San Diego Supercomputing Centre, and the University of Tokyo, with the European Bioinformatics Institute having a presence on the Technical Directorate. The Centre is expected to engage with related emerging networks of excellence.
The Centre will initially focus on biological and biomedical science. This area of science has the largest user community and the fastest-growing literature, and it is where most applications research in text mining is being undertaken. At the same time, the tools developed by the Centre will be of interest and relevance to the wider academic community. A major challenge for the Centre will be to handle, efficiently and robustly, very large volumes of text and the intermediate data produced during processing.
Visa International is experimenting with credit cards that include a small display screen where customers could view recent transactions, bank balances, or local currency exchange rates, says Deborah Arnold, Visa's vice president of global consumer strategies.
About 10 percent of Visa's 1.2 billion credit and debit cards in use worldwide already have a built-in computer chip, and that proportion is growing rapidly, Arnold says. The chip can store information that could be displayed on a small screen running across the top of a card, in a position similar to the magnetic stripe on the back of most cards today.
Visa expects to have produced working prototypes of the devices in about a year's time, Arnold says, although with security and other technology issues still to be ironed out, it's likely to be a few years before such cards become widely available. In addition, the LCD-like screens being used in early designs are too fragile for everyday use, Arnold says.
"The flexible credit card display screen has to be solid; something that lasts a long time without breaking," she says.
Another issue with the use of display screens on credit cards is cost-effectiveness, she says. "It's all expensive--the cost of the battery, the display, hooking [the display] up to the chip. The initial cards with these displays are going to be very expensive," she says.
Visa is in the early stages of deciding what information the cards should display. "It's a small piece of real estate, so you have to be careful about what to display," she says.
The banks and other institutions through which Visa delivers its cards will have a say in how much space to assign to the displays and what information services are offered, she says. Clearly, displaying personal account information on a card presents potential security concerns, and Visa is looking at technologies such as biometrics and passwords to secure access. Banks and end users will help to determine how much security is necessary, and certain information, such as points accrued through loyalty programs, might not need much protection, she says.
In general, cards fitted with a computer chip offer a greater level of security than conventional credit cards, according to Arnold. "Chips can't be duplicated easily," she says.
Showing multiple types of information on a tiny display presents challenges. Information could be displayed in a scrolling format, or buttons could be added that allow a customer to switch between different types of information. Future credit cards may even come in different shapes, allowing for a bigger screen. "There are a wide number of technical ways to solve the problem," she says.
The card displays are being made possible in part by the development of new types of "flexible displays" being discussed by vendors at the conference this week. Researchers are looking at a variety of technologies to build flexible OTFT-LCD (organic thin film transistor-liquid crystal display), electrophoretic, plasma, and OLED displays.
A credit card display could be powered by tiny batteries, like those being developed by Solicore, which displayed its wares at the conference this week. These batteries are "ruggedized" for use in smart cards, says Robert Singleton, Solicore's president and chief operating officer.
2 years ago...
E Ink has demonstrated a range of prototype "electronic paper" display screens with a thickness of just 0.3 millimeters--half the thickness of a credit card, the company said Thursday.
The display is built on a steel foil substrate and is therefore flexible as well as thin, making it suitable for rugged portable displays, E Ink said in a statement. Traditional active-matrix LCDs are built using two separate sheets of fragile glass and cannot be reduced to less than two millimeters in thickness.
The first prototype features a 1.6-inch diagonal screen with a resolution of 100 pixels by 80 pixels. This display is aimed at small mobile devices such as smart cards and cell phones, E Ink said in its statement. A larger prototype aimed at handheld devices such as personal digital assistants has a 3-inch diagonal screen and a resolution of 240 pixels by 160 pixels.
E Ink displays use a different technology from the LCDs used in existing notebook computer screens.
An E Ink display consists of a thin plastic film containing millions of tiny microcapsules filled with dark and light particles that carry opposite electric charges. This sheet is bonded to the steel foil substrate. Depending on the direction of the electric field from the transistors on the steel foil, either the dark or the light particles are drawn to the surface, generating a pixel of that color.
The displays are expected to be launched commercially in 2004, the Cambridge, Massachusetts, company said.
1 Year Ago...
For the past couple of years electronics companies researching OLED, or Organic Light Emitting Diode, displays have been making technology promises that are almost as bright as the displays themselves, but commercial products have been lacking.
The first commercial OLED began shipping in March 2003. However, both it and the latest batch of prototypes suggest that the power-reduction promises made about the technology may have been optimistic, at least for the first generation of products.
OLEDs are a fundamentally different technology from LCDs. They are made by sandwiching a layer of organic material between two electrical connectors. When a voltage is applied across the connectors, current flows through the organic material, causing it to glow.
This means that, unlike an LCD, no backlight is needed, so the entire display panel can be made thinner and lighter and should require less power than an equivalent LCD. At least, that's what was promised. Current prototypes, on display Wednesday in Tokyo at the Electronic Display Expo, consume around the same power as an LCD and in some cases more.
A prototype 2.1-inch panel from Seiko Epson consumes around 150 mW (milliwatts) when displaying a moving image. A thin-film transistor LCD of a similar size consumes just over 150 mW with its backlight switched on, making the OLED power saving negligible.
"The technology is still young," said Tsutomu Takenouchi of Seiko Epson's OLED technology division. "We hope to improve the power saving with future generations."
Prototype versions of 2.2-inch and 3.5-inch panels were also displayed by Toshiba Matsushita Display Technology. Commercial production is scheduled to begin sometime in 2004, said Jun Hanari of the company's research and development center. On power consumption, he said that in some cases, such as a still screen of black text on a white background, it could be as much as double that of a modern LCD.
However, to write off OLED technology just because it doesn't live up to promises about power consumption would be to ignore its other features, and to dismiss a market that DisplaySearch estimates will reach $3 billion in 2007.
In addition to being physically smaller, the prototype displays on show in Tokyo on Wednesday were brighter, showed more vibrant colors, and were much better at displaying moving images than similar LCDs.
One of the biggest hurdles to commercialization for many companies is the length of time the display can be used before its organic structure breaks down, said David Hsieh, an analyst at DisplaySearch in Taiwan. The problem is that the organic layer slowly succumbs to a chemical reaction that eventually renders it useless, he said.
"The stability of the organic materials is not easy to control," said Hsieh, "but if the layers can be well controlled, OLED stability is will be achievable and then commercialization will follow."
Independent school pupils are up to five times more likely to achieve the highest marks at A-level than their state school counterparts, research obtained by The Sunday Telegraph has revealed.
The statistics, compiled by the largest exam board, show that if a "supergrade" were introduced to distinguish between candidates with A grades, the gap between state and private pupils would widen dramatically. In some subjects, such as physics and German, independent school pupils - who are already twice as likely as state pupils to obtain an A grade - would become five times more likely to gain the highest mark.
The research will embarrass the Government and reinforce the suspicion of critics that it has failed to introduce an A-plus grade because this would make it far more difficult to persuade universities to discriminate in favour of state pupils. Whitehall wants universities to discriminate and is offering financial incentives to those that do and threatening financial penalties against those that do not.
Dr Martin Stephen, the chairman of the Headmasters' and Headmistresses' Conference, which represents 240 leading independent schools, said: "These figures show that pupils at independent schools are performing extremely well. "Universities should take students on the basis of their academic achievement. Going on anything other than achievement brings in discrimination that is immoral and probably illegal."
Philip Evans, the headmaster of Bedford School and the chairman of the independent schools' university admissions committee, said last night the figures proved private pupils were being discriminated against and called for an A-level supergrade. "At all levels within the A grade, independent school candidates seem to be winning hands down. Introducing different levels within the A grade would be a simple solution and make it more discriminating at the top end. It could be done tomorrow."
The new research was compiled by the Assessment and Qualifications Alliance (AQA) from the results of A-level candidates in 2003, and sets out how pupils would have performed if an A-plus grade had been available. This would have required pupils to achieve at least 560 points out of 600, rather than the 480 needed at present for a standard grade A.
In geography, just over 18 per cent of comprehensive school pupils achieved grade A in 2003, compared to 41 per cent in the independent sector. If an A-plus grade had been awarded, however, the difference would have been far greater. Only 1.7 per cent of the grade A comprehensive pupils would have achieved the top level, compared with almost six per cent from the independent sector.
In physics, the gap is even wider. The proportion of independent school pupils gaining the standard A grade in 2003 was more than double that in the state sector. However, while almost nine per cent of independent sector pupils gained enough marks for an A-plus, the corresponding figure for comprehensive pupils was only 1.6 per cent. This means that private pupils would have been more than five times as likely to earn the top grade in physics as those in the state sector.
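The "more than five times" claim follows directly from the reported percentages; taking "almost nine per cent" as 8.9 (an assumed reading of the rounded figure), a quick check:

```python
# AQA 2003 physics figures as reported: share of pupils reaching
# the notional A-plus threshold (560 of 600 points).
physics_independent = 8.9   # "almost nine per cent" (assumed value)
physics_comprehensive = 1.6

ratio = physics_independent / physics_comprehensive
print(round(ratio, 1))  # 5.6 -- "more than five times as likely"
```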
A spokesman for the AQA, which deals with 250,000 A-level entries each year, said: "The paper shows that at A-level, the independent sector does much better than the comprehensive sector in the proportion achieving grade A and if there was an A-plus they would do even better."
Leading universities want greater differentiation of A-level achievement because of the time and expense of whittling down candidates.
Look at this desktop. It's a Mac OS X machine, right?
Wrong. It's a Windows XP box made up to look like a Mac...
Robotic bollards that can quickly move across a carriageway to close off lanes have been developed by US engineers.
Each 130cm-high robot takes the form of a bright red barrel which sits atop a three-wheeled motorised base. A group of the bollards can be directed into position with a laptop and a main control unit equipped with a satellite navigation system for accuracy.
Starting in 1992 (the year Eric Drexler published Nanosystems), Japan spent a decade developing the foundations for bottom-up nanotech. They worked on four things:
1. "identification and manipulation of atoms and molecules" -- Obviously of great importance for bottom-up nanofabrication.
2. "formation and control of nanostructures on the surface and at the interface of materials" -- Likewise, very important.
3. "spin electronics" -- I'm not sure why this was in there; possibly to provide a short-term profitable spinoff (no pun intended).
4. "theoretical analysis of the dynamic processes of atoms and molecules" -- This sounds like research into bonding, and quite possibly relevant to mechanochemistry.
The leader, Kazunobu Tanaka, designed a highly interdisciplinary and collaborative working environment. He called for strong university participation. He put together a laboratory with 100 scientists sharing facilities, including a cafeteria and relaxation room. He also recommended active use of sabbatical leaves and flexible university curriculums.
According to one article, "Dr. Tanaka says nanotechnology in Japan will not make any progress unless project leaders and researchers with a wide outlook are brought up. He adds that the master plan for developing nanotechnology in Japan should be discussed from the mid- and long-term viewpoint by young researchers with strong physical and intellectual ability."
This sounds to me like a very effective process for developing advanced technology. And this is not just theory; it's been put into practice in a decade-long foundational project that finished two years ago. Japan has now put hundreds if not thousands of research-years into bottom-up fabrication. And they've had plenty of time to think about the implications and applications.
I'm encouraged that they appear to be socially conscious. The reason for the sabbaticals is to allow the researchers to "reconfirm the positioning of their own studies in society." And the purpose of the flexible university curriculums is "to respond quickly to changing times and to meet current social needs."
Physical chemists in China have made carbon-50 molecules in the solid state for the first time.
Lan-Sun Zheng and colleagues at Xiamen University, and co-workers at the Chinese Academy of Sciences in Beijing and Wuhan, prepared the molecules - which they describe as a long-sought little sister of carbon-60 - in an arc-discharge technique involving chlorine.
The result will allow scientists to study the properties of carbon-50 with a view to exploiting its unusual properties. The method developed by the Chinese team also opens the way to making other small, cage-like carbon molecules or "fullerenes" (S-Y Xie et al. 2004 Science 304 699).
The most common fullerene is carbon-60 - also known as buckminsterfullerene or "buckyball". This molecule, which contains 60 carbon atoms arranged in a spherical structure made up of pentagons and hexagons, was first created in 1985. Since then larger fullerenes containing between 70 and 500 carbon atoms have also been produced.
All the fullerenes made so far obey the isolated pentagon rule (IPR): this rule states that the most stable molecules are those in which every pentagon is surrounded by five hexagons. However, it is not possible to satisfy this rule in a molecule with fewer than 60 carbon atoms. This means that so-called non-IPR fullerenes should have unusual properties, but it also makes them structurally unstable and difficult to synthesise. Until now, fullerenes with fewer than 60 carbon atoms have only ever been made in the gas phase.
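The pentagon arithmetic behind the IPR is fixed by Euler's polyhedron formula: every fullerene C<sub>n</sub>, whatever its size, has exactly 12 pentagonal faces and n/2 - 10 hexagonal faces. With only 15 hexagons, C<sub>50</sub> simply cannot keep its 12 pentagons apart:

```python
# Face counts for a fullerene C_n from Euler's formula
# (V - E + F = 2, with n three-coordinate vertices):
# every fullerene has exactly 12 pentagons and n/2 - 10 hexagons.
def fullerene_faces(n):
    assert n % 2 == 0 and n >= 20, "fullerenes need an even n >= 20"
    pentagons = 12
    hexagons = n // 2 - 10
    return pentagons, hexagons

print(fullerene_faces(60))  # (12, 20) -- buckminsterfullerene
print(fullerene_faces(50))  # (12, 15) -- too few hexagons for the IPR
```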
Zheng and colleagues succeeded in stabilising and capturing solid-state carbon-50 molecules using a graphite arc-discharge method. They added 0.013 atmospheres of carbon tetrachloride vapour to 0.395 atmospheres of helium in a sealed stainless steel vessel and then applied a voltage of 24 volts. After purifying around 90 grams of soot that contained carbon-50 chloride (C50Cl10), they obtained about 2 milligrams of C50Cl10 that was 99.5% pure.
"The C50Cl10 looks like a spacecraft or a spinning planet with 10 reactive carbon-chlorine arms ready for further chemical functionalization," team member Su-Yuan Xie told PhysicsWeb (see figure). Like derivatives of carbon-60 and 70, Xie says that carbon-50 could easily react with a variety of organic groups to form new compounds with interesting chemical and physical properties. Moreover, the technique could also be extended to synthesise other small fullerenes, such as carbon-54 and carbon-56.