Sunday, August 29, 2010

Flying a simulated Boeing Dreamliner (photos)

Photo gallery: Welcome to Boeing, Training levels, Full Flight Simulator, Simulators.


Google gives real-time search its own page

Google has a new real-time search page that will allow users to visit a dedicated page when searching for breaking news.
(Credit: Screenshot by Tom Krazit/CNET)
Updated 11:23 a.m. PDT throughout with additional information and background.
Google has changed the way it presents real-time search updates, giving those search results their own section and a "conversation" view designed to cluster like-minded updates.
Back in December, Google first introduced "real-time" results from sources like Twitter and news organizations into its search pages, placing a dedicated window among regular search results that automatically scrolled through links to stories or tweets related to that topic. It still plans to highlight these types of results among regular search results, but it has created a separate page at google.com/realtime where those types of updates can be discovered.
The page got off to a rocky start, going down about half an hour after it launched for some users, but Google said it was rolling out gradually to searchers. It can also be accessed from the left-hand side of Google's main search results page under "updates," or directly through a longer URL that Google included in its blog post until the main one reaches everyone.
Real-time search is a thorny problem: it's a lot more difficult to harness the flood of real-time content and organize it in a relevant way than it is to crawl static Web pages. Google actually has to pay real-time sources of information like Twitter for access to the "firehose" of tweets in order to pull them into search results.
Microsoft and Yahoo have also experimented with real-time search: Bing, for example, has a separate page dedicated to "social" results. Startups such as OneRiot have also tried to harness the astounding amount of content produced by social-media users.
As of the first few hours of the new page, Twitter was the main source of content within the search results, although public posts from Facebook, Google Buzz, and other social networking sites like Myspace and Friendfeed showed up in the listings. At the moment, no ads are showing up alongside the results, but it's not hard to imagine that changing should the dedicated page gain traction with searchers.
Searchers can filter the real-time results by geographic location by using the "nearby" link on the left-hand rail, but it needs a little work, surfacing results from Vermont and Mississippi on a search for "obama" filtered to produce results "nearby" the Bay Area.

Nokia 6600i unleashed in India !



The Nokia 6600i has just been launched in India, establishing itself as the smallest 5-megapixel phone on the market. The 6600i has a slider form factor with an extremely impressive design and style.

The camera, the phone's USP, promises to deliver crystal-clear imaging with its 8x zoom and autofocus functionality. It also comes with a dual-LED flash to enhance image quality.

Other features include an FM radio, playback for a wide range of media formats, a stereo headset and Bluetooth v2.0. A microSD card slot accepts cards of up to 16GB.

Intel, AMD vie to rewire PC's brain


Intel and AMD are off to the races again. This time it's about making PCs not just faster, but more versatile.
The two longstanding PC chip rivals seem to agree, roughly, on one thing: the need to meld the two key PC chips, the central and graphics processing units, into one processor. But they both bring different strengths to achieve that end.
Why combine chips? Put simply, it takes less energy to move electrons across a chip than to move those same electrons between two chips, so integration saves energy and results in better battery life for laptops, a point made by Insight 64 principal analyst Nathan Brookwood in a white paper written for AMD but one that, in some fundamental respects, applies equally to Intel.
Heterogeneous computing combines functions typically found on a graphics processor with the main CPU chip.
(Credit: AMD)
And CPUs and GPUs are suited to different kinds of computing. CPUs can handle a broad array of tasks, while GPUs are more specialized but much faster at certain types of operations. Future heterogeneous chips could find photos and videos in your library that contain particular faces or places. Or recognize your face when you log in. In short, putting both capabilities on one piece of silicon creates a brainier chip with more processing brawn.
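As a rough illustration of this division of labor, here is a minimal Python sketch; NumPy stands in for the data-parallel hardware, and the data and function names are invented for the example, not taken from any vendor SDK. A branchy, per-item decision task suits the CPU, while scoring a big block of image regions against a face template is the kind of uniform arithmetic a GPU-style engine excels at.

```python
import numpy as np

def cpu_style_task(records):
    """Branchy, per-item decisions: control flow differs for every record (CPU-friendly)."""
    matched = []
    for rec in records:
        if rec["kind"] == "photo" and rec["faces"] > 0:
            matched.append(rec["name"])
        elif rec["kind"] == "video" and rec["duration"] > 60:
            matched.append(rec["name"])
    return matched

def gpu_style_task(regions, face_template):
    """Uniform arithmetic over a big block of data (GPU-friendly):
    score every image region against a face template in one vectorized pass."""
    scores = regions @ face_template     # one large matrix-vector product
    return int(np.argmax(scores))        # index of the best-matching region

if __name__ == "__main__":
    library = [
        {"kind": "photo", "faces": 2, "name": "holiday.jpg", "duration": 0},
        {"kind": "video", "faces": 0, "name": "clip.mp4", "duration": 95},
    ]
    regions = np.random.rand(1000, 64)   # 1,000 candidate regions, 64 features each
    template = np.random.rand(64)
    print(cpu_style_task(library), gpu_style_task(regions, template))
```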
The question, of course, is which company will deliver the goods and drive cutting-edge PC--particularly laptop--designs in 2011. AMD claims that, because it is also a supplier of GPUs via its ATI graphics chip unit, its products are more forward-looking, with an increased emphasis on graphics that tap into key multimedia technologies like Microsoft's DirectX and Apple's OpenCL.
"Intel is understandably more CPU centric. That's Intel's view," said John Taylor, director of marketing for Fusion at AMD. "We're a provider (via ATI) of graphics chips. We're incorporating world-class GPU intellectual property into a new type of design. We look at the GPU in a consumer notebook as a very efficient compute engine as well as all of the wonderful 2D and 3D graphics capabilities," Taylor said, adding that Intel is just "sprinkling" low-level graphics on its CPUs.
Not surprisingly, Intel, the world's largest chipmaker, believes it has the upper hand because its cutting-edge manufacturing technology allows it to integrate more on a piece of silicon, sooner. Intel's Atom chip, for example, already melds two processing cores and the graphics function on a single piece of silicon.
And Intel was the first--early this year--to move to 32-nanometer technology, which allows the chipmaker to cram more functions onto the chip. (Globalfoundries, AMD's manufacturing partner, won't make that move until 2011.) The upcoming 32-nanometer Sandy Bridge architecture from Intel will represent the fruition of this effort. "Sandy Bridge combines multiple cores together with the graphics circuitry on the same chip," said Mark Bohr, Intel senior fellow. "The fact that we're an integrated device manufacturer allows us to do internal optimization of all of these pieces and bring out a leading-edge product sooner than other companies."
So, here's a brief overview of laptop-centric technologies that AMD and Intel are planning to roll out over the next 6 to 12 months or so. Consumers, of course, will ultimately decide who prevails.
AMD's Ontario (2010): 
  • From-the-ground-up redesign; very-low-power x86 core 
  • Single piece of silicon 
  • Up to two CPU cores with DirectX 11 ATI 5000-series GPU technology and a new video decoder 
  • Targeted at netbooks, ultrathin laptops, and all-in-one PCs 
  • 40-nanometer "bulk" process; manufactured by TSMC*
  • Due to ship in Q4 2010 with laptops due early 2011 
*Taiwan Semiconductor Manufacturing Company
AMD's Llano (2011): 
  • Up to four CPU cores with DirectX 11 GPU, upgraded ATI 5000 series GPU technology and video decoder 
  • Single piece of silicon 
  • Targeted at mainstream and ultrathin laptops and certain desktop market segments 
  • 32-nanometer High K metal gate process; manufactured by Globalfoundries 
  • Due to ship in first half of 2011 
Intel's Sandy Bridge: 
  • Single piece of silicon, combining CPU and GPU
  • Faster on-chip communication: different parts talk via "improved inter-buses."
  • Improvements to the way instructions are executed 
  • New instructions to accelerate multimedia: Intel Advanced Vector Extension (AVX) instructions. 
  • Improved Turbo Boost: slowing down, speeding up individual cores as needed. 
  • Special circuits for handling transcoding (conversion of video/audio from one format to another) 
  • 32-nanometer High K metal gate process 
  • Due to ship in Q4 (more details to be revealed at the Intel Developer Forum in September). 
Intel's dual-core Atom (mentioned above): 
  • Up to 2 cores, 4 threads 
  • CPU integrates GPU on the same die (single piece of silicon) 
  • Low power: the dual-core version has a maximum thermal envelope of 8.5 watts 
  • Shipment date: now 
A few additional items worth noting:
High-k metal gate: Intel has been building chips with this transistor technology since 2007; it generally yields chips that are faster and run cooler. AMD won't move to this technology until 2011.
32-nanometer: Intel has also been supplying 32-nanometer chips since early this year; AMD won't get there until 2011 but it is moving to an intermediate 40-nanometer process later this year.
AMD's "Bulldozer" core: This is a new 32-nanometer chip architecture due in the first half of 2011. Targeted initially at high-end desktops and servers, it will offer multithreading, which increases the number of tasks a processing core can handle (Intel has been offering this for a long time). Bulldozer will yield high core counts (such as 8-core desktop chips) and support new x86 instructions, including SSE4.2 and AVX.
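AVX, like SSE before it, is about processing several values per instruction. As a rough illustration of the kind of workload it targets, here is a short Python comparison; NumPy's compiled kernels can use SSE/AVX where the build and the CPU support them, so the vectorized call stands in for what the new instructions accelerate. The timings are indicative only, not a benchmark of either vendor's chips.

```python
import time
import numpy as np

n = 1_000_000
a = np.random.rand(n)
b = np.random.rand(n)

# One multiply-add at a time, in interpreted Python.
t0 = time.perf_counter()
total = 0.0
for i in range(n):
    total += a[i] * b[i]
t_loop = time.perf_counter() - t0

# The same dot product as a single vectorized call. NumPy's compiled kernel
# can process several elements per instruction (SSE/AVX) where available.
t0 = time.perf_counter()
total_vec = float(a @ b)
t_vec = time.perf_counter() - t0

print(f"python loop: {t_loop:.3f}s   vectorized dot product: {t_vec:.4f}s")
print(f"results agree to within rounding: {abs(total - total_vec) < 1e-6 * abs(total_vec)}")
```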

Friday, August 27, 2010

Humanoid Robot !


Telenoid R1 minimalist humanoid robot

Researchers from Osaka University have teamed up with the Advanced Telecommunications Research Institute (ATR) to develop a minimalist humanoid robot that recreates the physical presence of a remote user.
Named "Telenoid R1," the teleoperated communication robot measures 80 centimeters (31 in) tall and weighs 5 kilograms (11 lbs). The portable machine features a soft silicone body that is pleasant to the touch, and it uses 9 actuators to move its eyes, mouth, head and rudimentary limbs.
Data is transmitted between the user and the robot via an Internet connection.
The Telenoid R1 robot is designed to add an element of realism to long-distance communication by recreating the physical presence of the remote user. The robot's actions mirror those of the remote user, whose movements are monitored by real-time face tracking software on the user's computer. Users can also transmit their voice through the robot's embedded speakers.
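In outline, the teleoperation loop described above can be sketched as follows. This is a hypothetical illustration only, not ATR's actual software: the address, port, message format and the track_face/set_actuators helpers are invented for the example.

```python
import json
import socket

ROBOT_ADDR = ("telenoid.example.net", 9000)   # hypothetical address of the robot-side receiver

def operator_loop(track_face, capture_audio_chunk):
    """Operator's PC: read head pose from the face tracker and stream it, with voice, to the robot."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        pose = track_face()                   # e.g. {"yaw": 5.0, "pitch": -2.0, "mouth_open": 0.3}
        packet = {"pose": pose, "audio": capture_audio_chunk().hex()}
        sock.sendto(json.dumps(packet).encode(), ROBOT_ADDR)

def robot_loop(set_actuators, play_audio):
    """Robot side: map each received pose onto the nine actuators and play the operator's voice."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", ROBOT_ADDR[1]))
    while True:
        data, _ = sock.recvfrom(65535)
        packet = json.loads(data.decode())
        set_actuators(packet["pose"])         # eyes, mouth, head and rudimentary limbs
        play_audio(bytes.fromhex(packet["audio"]))
```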
Telenoid R1 with Dr. Hiroshi Ishiguro (Osaka University)
The Telenoid R1 is endowed with only the most basic human features -- just enough to recreate the physical presence of the remote user, according to the robot's creators. The robot's androgynous and ageless look makes it suitable for a wide range of users, whether they are male, female, young or old.
English lessons can be conducted via the Telenoid R1 robot
At the unveiling in Osaka on August 1, the developers announced plans to begin selling two versions of the minimalist humanoid in October. The high-end model will be priced at about 3 million yen ($35,000), and a cheaper model will be available for about 700,000 yen ($8,000).

Thursday, August 26, 2010

Latest TECH: Flexible, full-color OLED display

On May 24, Sony unveiled what it is calling the world's first flexible, full-color organic light emitting diode (OLED) display built on organic thin-film transistor (TFT) technology. OLEDs typically use a glass substrate, but Sony researchers developed new technology for forming organic TFT on a plastic substrate, enabling them to create a thin, lightweight and flexible full-color display. The 2.5-inch prototype display supports 16.8 million colors at a 120 x 160 pixel resolution (80 ppi, .318-mm pixel pitch), is 0.3 mm thick and weighs 1.5 grams without the driver.
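Those display figures hang together; here is a quick back-of-the-envelope check (Python) of the quoted resolution, pixel density and pixel pitch.

```python
import math

h_px, v_px = 120, 160            # quoted resolution
diagonal_in = 2.5                # quoted screen size in inches

diag_px = math.hypot(h_px, v_px)          # 200 pixels along the diagonal
ppi = diag_px / diagonal_in               # 200 / 2.5 = 80 pixels per inch
pitch_mm = 25.4 / ppi                     # 25.4 mm per inch / 80 ppi = 0.3175 mm

print(round(ppi), round(pitch_mm, 3))     # 80 0.318 -- matches the quoted figures
```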

According to Sony, which plans to release a new line of miniature TVs this year and is bolstering efforts to develop next-generation flat-panel OLEDs, this new technology will lead to the development of thinner, lighter and softer electronics.
The company is scheduled to present the results of its research at the SID 2007 International Symposium now underway in the US.

NASA with space telescope !!

How Real Satellites/Space Telescopes Come About
A December 2007 artist's conception of the James Webb Space Telescope. Credit: NASA
It takes years to bring a real large space telescope from basic concept to hardware reality. First, a scientist comes up with an idea to study some aspect of the Earth or the cosmos. The idea is discussed, reviewed and developed by committees of scientists. It is then proposed to NASA, which decides which missions to go forward with and which to pass on. If a mission is selected for study, a timeline is created to develop it.

One of the most difficult aspects of creating a new mission is convincing others to fund it. Once a mission is funded, the team of scientists and engineers "pitching" the mission can then investigate how it could come together. Later, NASA usually selects a prime contractor to help design the telescope and other systems that will fly on the satellite. Northrop Grumman was selected to build components for the Webb telescope. The instruments, or cameras, on the telescope are selected as well, with teams of scientists to watch over the design.

The design process usually includes a number of different designs, which are all tested to see which would yield the best result for the type of object the instrument would study. For example, various types of infrared cameras may be developed and tested, and the one that gives scientists the best result would be chosen to be built as a test unit.
Engineering Test Units
The life-sized James Webb Space Telescope model sits in front of the Royal Hospital Kilmainham, in Dublin, Ireland. Credit: Richard Bent, Northrop Grumman Space Technology
Engineering test units, or ETUs, are created before an actual instrument is built, so that engineers and scientists can make sure it will work properly. An ETU is a replica of the flight unit that can perform certain flight functions for testing purposes. ETUs are also used when engineers are practicing installation of an instrument into a satellite's mainframe, or "bus." The outcome of the tests on ETUs may lead to a change in handling procedures for the actual flight instrument, but not a change in its flight construction.

Once the ETUs test successfully, the actual instruments that will fly aboard a satellite or space telescope can be manufactured. Those instruments go through their own set of rigorous tests by the manufacturing contractor, NASA and other partners. On the Webb telescope, NASA is partnering with the European Space Agency and the Canadian Space Agency.
Testing the System

Satellite and space telescope instruments must endure harsh temperature swings as big as 200 degrees Fahrenheit, micro-meteor impacts and exposure to solar radiation. On top of that, before a spacecraft like the Webb can operate in orbit, it has to survive a ride on a rocket to get there. That's where environmental testing chambers like the ones at NASA's Goddard Space Flight Center in Greenbelt, Md., come into play. Hardware gets run through NASA Goddard's centrifuge, acoustics and thermal vacuum chambers to ensure it can endure the rigors of launch.

The centrifuge simulates the increased feeling of gravity's pull during a launch. For astronauts, that's normally a few minutes at two or three times the force of Earth's gravity, measured in Gs. The Webb telescope can experience 6-7 G's due to the Ariane 5 rocket's combined acceleration and vibration. The Webb telescope will be launched on an Ariane 5 ECA rocket. The launch vehicle is part of the European contribution to the mission.

Launching a rocket carrying a satellite or space telescope creates extraordinarily loud noise, so engineers use an Acoustic Test Chamber to make sure an instrument can handle it safely. In Goddard's 42-foot-tall chamber, technicians expose payloads to the noise of a launch. To do that, they rely on 6-foot-tall speakers. The speakers (more accurately called horns) use an alternating flow of gaseous nitrogen to produce a sound level as high as 143 decibels for one-minute tests. That's about the level of sound heard standing next to a jet engine during takeoff.
This panorama shows the inside of Goddard's High Bay Clean Room, as seen from the observation deck. The clean room will be a home to some Webb components before the telescope is put together. Credit: NASA/Chris Gunn

The hardware is also tested in the thermal vacuum chamber, which exposes it to the conditions it will experience in space. The chamber has massive mechanical vacuum pumps and cryopumps to ensure that the hard vacuum of space is simulated in the test chamber. The cryopumps use gaseous helium to condense remaining gases out of the chamber once the mechanical pumps have done their work. The two types of pumps work together to eliminate all but the tiniest trace of air in the chamber, down to about a billionth of Earth's normal atmospheric pressure.

Because the Webb telescope operates in the infrared portion of the electromagnetic spectrum, it is designed to run at very cold temperatures. To simulate this environment, an additional cooling system, a helium refrigeration system, was added so the thermal vacuum chamber could reach temperatures in the -413 Fahrenheit (F) range. "The ISIM structure was tested in our thermal vacuum chamber down to about 26 Kelvin, or minus 413 F," said Jon F. Lawrence, Webb telescope Mechanical Systems Lead Engineer/Launch Vehicle Liaison at NASA Goddard.
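The Kelvin and Fahrenheit figures in that quote line up, as a one-line conversion check shows (Python):

```python
def kelvin_to_fahrenheit(k):
    return k * 9.0 / 5.0 - 459.67

print(round(kelvin_to_fahrenheit(26), 1))   # -412.9, i.e. roughly minus 413 F
```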

This test program starts at the lowest level of assembly, with instrument or spacecraft components, and is repeated at each successive level of assembly. Once the instruments pass these tests, they are all put together into the structure which holds them, and the unit is tested again. The instrument structure is connected to the telescope and the whole observatory is tested yet again. There isn't a vacuum chamber large enough to hold the entire Webb observatory at NASA Goddard, so the telescope will travel to NASA's Johnson Space Center in Houston, Texas, to be tested in a chamber that was originally built for testing the Apollo command module to simulate the trip to the moon. The next stop after that is launch into deep space.

Currently, ETUs or actual flight hardware for the Webb telescope are being tested in various ways.

The James Webb Space Telescope is the next-generation premier space observatory, exploring deep space phenomena from distant galaxies to nearby planets and stars. The Webb Telescope will give scientists clues about the formation of the universe and the evolution of our own solar system, from the first light after the Big Bang to the formation of star systems capable of supporting life on planets like Earth.

The Webb Telescope project is managed at NASA's Goddard Space Flight Center in Greenbelt, Md. The telescope is a joint project of NASA, the European Space Agency and the Canadian Space Agency.

For information about NASA's James Webb Space Telescope, visit:
http://www.jwst.nasa.gov/

To see photos of the Full-scale model on the National Mall in Washington, D.C. May 10-12, 2007, visit:
http://www.flickr.com/photos/scifilaura/sets/72157600211237270/

Rob Gutro
NASA's Goddard Space Flight Center, Greenbelt, Md.

10 new technologies that will revolutionise your life !

Staying up to date with everything that's going on in PCs and tech is almost impossible, so these are the ten technologies that you should be most aware of, as they're the ones that'll make the biggest difference to your life.
1. 3D gaming
Getting any kind of 3D image from a 2D screen means wearing a pair of sunglasses or worse, so three-dimensional gaming isn't quite as convincing as multitouch and natural user interfaces, even though the two have been commoditised at almost the same time.
An Acer Aspire 5738 laptop with a 3D display costs about £550 at the moment, not bad for something with cutting-edge technology that adds depth to any DirectX 9 game. The screen is of the polarised filter type, which is the new norm for extra dimensions.


Instead of using coloured filters splitting an image into two – one for each eye – the vertical pixel columns are alternated between the left image and the right image and shone through a piece of polarised glass. A pair of dark glasses with oppositely polarised lenses ensures that only one image is seen by each eye. The difference to a game is tangible too: something like WoW runs and looks incredible on the laptop's low-end graphics hardware.
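Here is a minimal sketch of that column interleaving, assuming two same-sized left-eye and right-eye frames (Python/NumPy); the polarised glass and the glasses then ensure each eye sees only its own columns.

```python
import numpy as np

def interleave_columns(left, right):
    """Build one frame for a column-polarised 3D screen:
    even pixel columns carry the left-eye image, odd columns the right-eye image."""
    assert left.shape == right.shape
    frame = np.empty_like(left)
    frame[:, 0::2] = left[:, 0::2]     # even columns -> left eye
    frame[:, 1::2] = right[:, 1::2]    # odd columns  -> right eye
    return frame

# Example: two 1,366 x 768 RGB frames (noise here) merged into one displayable frame.
left = np.random.randint(0, 256, (768, 1366, 3), dtype=np.uint8)
right = np.random.randint(0, 256, (768, 1366, 3), dtype=np.uint8)
frame = interleave_columns(left, right)
```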
It's over in TV land that the real push for 3D is happening, though, as LCD suppliers ask us to upgrade again to watch hyper-real cinema in the lounge. Compared to the other technologies we've talked about here, though, 3D requires a lot of effort on the part of the viewer (those pesky glasses) and most of us are very lazy; hence the ubiquity of MP3 and standard definition movies, while Blu-ray and higher resolution sound standards continue to flounder. We value ease of use over quality every time.

In its favour, 3D doesn't actually require any work on the part of games developers or publishers, as the stereoscopic image is created at the driver level. On the other hand, that means there's no massive push by the people who make and sell games to encourage us to adopt it.
2. Streaming games
The advance of superfast broadband hasn't just helped the cause of downloadable games. It will also have no small impact on the future of streaming games over broadband, or at least that's the theory.
There are several companies pursuing, and a significant amount of money invested in, the idea that one day your precious PC will be almost entirely redundant as a games machine.
The concept is simple: all the game's data is hosted on a central server, and all you have to do is receive the display and send back input commands. It's a little like the technology used for MMORPGs, except that the rendering engine isn't on your PC; it's in the same server farm as the core intelligence.
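In outline, the thin client does little more than forward input and draw whatever frames come back. Here is a hypothetical Python sketch of that loop; the server address, length-prefixed framing and helper functions are invented for illustration, and real services such as OnLive use their own protocols.

```python
import json
import socket
import struct

SERVER = ("stream.example.net", 7000)   # hypothetical game-streaming server

def recv_exactly(sock, n):
    """Read exactly n bytes from the socket (TCP may deliver data in smaller pieces)."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("server closed the stream")
        buf += chunk
    return buf

def client_loop(read_input_events, decode_and_display):
    """Thin client: send the player's input upstream, receive and display rendered frames."""
    sock = socket.create_connection(SERVER)
    try:
        while True:
            # 1. Forward whatever the player did this tick (keys, mouse deltas).
            payload = json.dumps(read_input_events()).encode()
            sock.sendall(struct.pack("!I", len(payload)) + payload)

            # 2. Read back one length-prefixed, already-rendered and encoded video frame.
            (frame_len,) = struct.unpack("!I", recv_exactly(sock, 4))
            decode_and_display(recv_exactly(sock, frame_len))
    finally:
        sock.close()
```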
This idea was actually mooted some years ago with the Phantom console, which never made it to the stores. It's looking unlikely that OnLive (www.onlive.com), Gaikai and Microsoft's own streaming project will end up as vapourware though, despite the obvious concerns about input lag: the delay that occurs every time you press a key, because the signal has to travel hundreds of miles before a character even moves.
Proponents say that even twitch FPS games are possible, but we're a little more sceptical. There's another reason that at least one of these services will be properly launched soon, and that's vested interest by games publishers.
Because no content is stored on your machine, of course, it's impossible to pirate a streamed game, which is obviously an attractive proposition for them. In the immediate future, though, it is more likely to be a technology that runs like games such as Quake Live, which use a combination of some local processing power and some server-based cycles. That's certainly the route Microsoft is taking, and seems more achievable than relying on 'the cloud' at this stage.
3. Six-core processors
You won't have to wait long for this one. Intel's Westmere CPUs may be hanging around with the dregs of processor society at the moment, chucking their chips in with the integrated graphics crowd, but they're about to grow up – and fast.
Sometime over the next few months Intel will go two better than the current line-up of quad-core CPUs by launching a six-core version of its high-end Core i7 line. It's based on the existing Nehalem architecture; the headline feature is a process shrink down to 32nm, while the rest of the spec sheet remains largely the same. It could be a genuine upgrade.

Games programmers are getting much better at working with multithreaded code, so most major titles, like Empire: Total War and its forthcoming sequel Napoleon, will see a much bigger performance increase when given extra cores to play with than the often sporadic leaps in frame rate we saw going from two to four cores.
Because the benefits will be in the number of cores, rather than the speed of any single one, Intel are encouraging some developers to add extra content specifically for people with a six-core CPU. Given the plethora of disappointments we've had lately with almost every technology that's promised to increase our frame rates, we'll reserve judgement until we have one in the office.
The good news is that these hexa-powered processors will fit into most existing X58 motherboards after a simple BIOS flash. The bad news is X58 motherboards are still very expensive too.
4. Wireless power
A few years ago we saw a demonstration by a team at Fulton Innovation of a product called eCoupled. Using the principle of electromagnetic induction, by which an electrical charge can be stimulated in a wire coil by placing another one nearby, the crazy boffins were able to demonstrate wireless power transfer.
Despite being high voltage, they said, it was safe, efficient and could be applied to any surface. The demo room consisted of a kitchen without plugs, but full of lights that could be stuck anywhere and a frying pan that heated up just by being set on the counter. Put a phone on the same counter and it began charging. Clearly, this was the future.


Fulton are still working on wireless power, but it's a different company that's beaten it to the shops, Powermat – and its products are expensive for something that replaces a 50p mains plug.
The good news is that the Wireless Power Consortium are going to be finalising a standard for wireless power called Qi later this year, which should mean prices drop and manufacturers have the confidence to build the technology straight into devices, rather than requiring an adaptor.
If you think that's crazy, though, take a look at Airnergy by RCA. It's a tiny dongle that can turn Wi-Fi signals back into electricity for charging phone batteries and the like.
5. Wireless displays
The last two standards for monitors, HDMI and DisplayPort, didn't exactly have us all rushing out to upgrade our PC screens and graphics cards, so it's a safe bet that DVI will remain the cabled interface of choice for some time to come. What about connecting a monitor to your PC without wires though?
That's something that could be worth shelling out for. Two different technologies, which should be available en masse this year, were on the show floor at CES.
The first, WirelessHD, is being pushed by the usual line-up of TV and DVD player manufacturers as a replacement for HDMI. It uses a short-range, high-bandwidth link in the Ultra-Wide Band (UWB) spectrum to transmit HD video and audio from a set-top box or media centre to a TV screen.
The idea is nothing new: Philips have had a kit out for a while that does the same thing, but WirelessHD is a proper standard and should ensure maker A's TV works nicely with maker B's Blu-ray machine and so on.
Perhaps more relevant for us, though, is Intel's new Wireless Display, or WiDi. It's designed specifically for laptops in order to remove the hassle of cables when you want to dock them with a proper screen, and like WirelessHD sends the video signal to a receiver box.
Unlike WirelessHD, WiDi can't handle protected content and the like, but it is much simpler since it requires no new hardware inside the laptop. Instead of using a separate transmitter, WiDi is a software layer on top of the existing Wi-Fi chip, so it's much cheaper to produce. Providing there's no latency introduced to the picture refresh rate, this could be a killer.


6. OLED displays
Yeah, we hear you. Another year, another promise that OLED screens are going to take over. Haven't we heard it all before? Except this time it could be true.
Google's Nexus phone has just launched with an OLED screen, and by all accounts it makes the handset almost – say it in hushed tones – more desirable than the iPhone. Brighter colours, sharper resolutions, darker blacks, whiter whites; why is this OLED technology so superior?
Put simply, it's because instead of filtering the light from a white or blue lamp behind the screen, each pixel in an OLED panel produces its own light. You don't have to be an optometrist to see why this is better, but it is much more expensive to produce.
Still, it also means OLED screens are much thinner than backlit ones, for obvious reasons, and while you may not be using an OLED PC monitor by the end of the year there are a lot more laptops with the technology arriving.
7. Superfast broadband
There are two things about broadband you should be concerned about. The first is whether or not the Digital Britain report, with its three-strikes policy, outrageous invasion of privacy and extra charging for bandwidth, makes it into law before the general election finally hits.
The second is what's going on at your exchange. By early next year, 75 per cent of us should be living in proximity to a telephone exchange that has a fibre optic connection to the internet. It's all part of BT's 21CN project to replace the entire copper telephone and broadband internet infrastructure with a single ethernet-based network fit for the 21st century.

So far, it's been dogged by delays and problems, but it's finally picking up the pace and is being tested by ISPs all over the country. The idea is that it will increase competition for high-speed broadband and bring down access prices, as well as bring services like IPTV – of the sort Virgin customers enjoy – to everyone.

It doesn't just mean better access to large downloadable game files and lower ping times, however. Our biggest hope is that it will eventually encourage telephone companies to do away with the irritating £12 a month line rental charge for a phone we don't actually use.
8. Augmented reality
Actually, we're kidding ourselves with this one. Augmented reality, the ability to overlay information on a live video feed of the world, is very cool, and it's impossible not to love iPhone apps like Yelp that pull in details of and distances to the nearest pubs or restaurants as you point the camera in their general direction.
Holding your phone three inches in front of your face as you're walking around feels a little too ridiculous to catch on, though. Perhaps it's like handsfree and Bluetooth headsets. Not so long ago people still sniggered if they saw someone using a phone without holding it to their ear, and not so long before that mobile phones themselves were devices for sales dorks.
In a couple of years' time, it may seem the most natural thing in the world to see someone walking around with a phone held at arm's length, directing them to food or drink with their own personal dynamic GPS system, or pulling up interesting information about the buildings and people in front of them. Yes, that's right, people.
Twitter 360 is an iPhone app that directs you to geotagged tweeters on your friends list, while TAT (www.tat.se) is working on an Augmented ID program, so that if people point a camera at you, various bits of information from your social feeds will float around your head. Makes stalking a lot easier, then. Scary.
9. Natural user interface
In his CES keynote presentation, Steve Ballmer made several references to the 'Natural User Interface' (NUI), which is a handy catch-all to describe all the Wii wand-waving, multitouch pointing and Project Natal-style aerobics that are catching headlines out there.
The keyboard and mouse is by no means dead, but the sudden flood of cheap laptops and all-in-ones with a built-in, multitouch screen suggests that it won't be long before we'll all have something a little bit different on our desktops.
Over in the US, for example, custom laptop maker IBuyPower has already started selling high-end gaming notebooks with a multitouch screen, and French developer Eugen Systems has incorporated Win7's multitouch controls into the heart of its forthcoming strategy title R.U.S.E. It's all very exciting, except for one thing.
Multitouch may be native to Win7 and no other operating system, but the implementation is nowhere near as smooth as it is on, say, the iPhone. PCFormat has yet to use a multitouch application on a PC that doesn't suffer from a bit of inaccuracy or sluggishness, and the key to the NUI is in the first word. It has to feel natural, unforced and invisible to the end user. That's what using multitouch on an iPhone is like, and that's what Windows must achieve. If the mouse remains faster and more trusted, that's what people will use.
There are some brilliant ideas out there, though. Project Natal, Microsoft's full body 3D gesture recognition system for Xbox 360, is by far the most ambitious prototype, and we can't wait for a PC hack.
At CES the prototype Light Touch projector, from Light Blue Optics (lightblueoptics.com), was a show stealer. Using a technique called holographic laser projection, this tiny projector turns any 10-inch surface – flat or curved – into a sharp multitouch screen.
10. Long-term evolution
The idea of getting high speed, super-reliable mobile broadband from a cell tower to your laptop or phone via WiMAX is not quite dead in the water, but it's certainly in need of a bit of mouth to mouth.
Far from being the 'Next Big Thing' as it was touted a few years back, it's had a painful and traumatic incubation period, which has seen some US carriers begin to adopt it and, in fact, quite a few businesses use it for point to point communications, but public access has dwindled from trial areas to almost nothing.
Partly, this may be because the company which holds licences to operate WiMAX in several cities, Freedom4, was recently bought out and the new owners aren't in any rush to monetise the technology. More likely, it's because the mobile phone companies are happy with current HSDPA speeds and are betting on an alternative technology, known as 4G or Long Term Evolution (LTE), to supply almost the same amount of bandwidth without completely reworking their networks for WiMAX.
Lucky Scandinavians living in Stockholm or Oslo with a TeliaSonera contract can already sign up for LTE, while O2 is planning on launching a 150Mbps LTE package in the UK some time this year. We don't expect WiMAX to give up without a serious fight, though.
In the US, mobile networks are beginning to fall over because of the volume of 3G traffic running over them, and WiMAX's new architecture could well be a way to increase capacity to cope with demand. In which case, expect to see it begin sprouting up everywhere.
Faster bits and bytes
Rather more prosaically than the technologies listed elsewhere in this article, the internals of your PC are also being overhauled by the powers that be. There's a revision to the SATA standard out for disk drives, and USB 3.0 is appearing on motherboards to speed up the default peripheral connection.
They are big steps forward. SATA III doubles the bandwidth available to storage from a theoretical 3Gbps to 6Gbps, while on paper USB 3.0 is a ten-fold increase from 480Mbps to 4.8Gbps for cabled peripherals.
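On paper, those figures translate into the following best-case times to move a 25GB file; real-world throughput is always lower, since the quoted rates are raw link speeds before protocol overhead (a quick Python estimate).

```python
def transfer_seconds(size_gigabytes, link_gigabits_per_second):
    """Best-case time to move a file over a link quoted in gigabits per second."""
    return size_gigabytes * 8 / link_gigabits_per_second

for name, gbps in [("SATA 3Gbps", 3.0), ("SATA III 6Gbps", 6.0),
                   ("USB 2.0 480Mbps", 0.48), ("USB 3.0 4.8Gbps", 4.8)]:
    print(f"{name}: about {transfer_seconds(25, gbps):.0f} seconds for a 25GB file")
```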
Motherboards sporting ports of both flavours are already available from most manufacturers. Although both technologies are much faster than their predecessors, neither is likely to have a huge impact on consumer PCs.
In the world of business where milliseconds are money, the upgrades may mean something, but for the likes of us, compatible drives and peripherals will be a while coming yet.

