January 31, 2012

How Touch Screens Work


Touch-screen monitors have become more and more commonplace as their price has steadily dropped over the past decade. There are three basic systems that are used to recognize a person's touch:
  • Resistive
  • Capacitive
  • Surface acoustic wave
The resistive system consists of a normal glass panel that is covered with a conductive and a resistive metallic layer. These two layers are held apart by spacers, and a scratch-resistant layer is placed on top of the whole setup. An electrical current runs through the two layers while the monitor is operational. When a user touches the screen, the two layers make contact in that exact spot. The change in the electrical field is noted and the coordinates of the point of contact are calculated by the computer. Once the coordinates are known, a special driver translates the touch into something that the operating system can understand, much as a computer mouse driver translates a mouse's movements into a click or a drag.
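To make that last step concrete, here is a minimal Python sketch of how a driver might turn raw voltage readings into screen coordinates (the ADC resolution, screen size and function names are invented for the example, not taken from any real driver):

    # Minimal sketch of 4-wire resistive touch decoding (hypothetical values).
    # A voltage gradient is driven across one layer; the other layer reads the
    # voltage at the contact point, which is proportional to the position.

    ADC_MAX = 4095                  # 12-bit analog-to-digital converter full scale
    SCREEN_W, SCREEN_H = 800, 600   # display resolution in pixels

    def decode_touch(adc_x, adc_y):
        """Convert raw ADC readings (0..ADC_MAX) into pixel coordinates."""
        x = adc_x / ADC_MAX * SCREEN_W
        y = adc_y / ADC_MAX * SCREEN_H
        return round(x), round(y)

    # A touch reading half scale on X and a quarter scale on Y:
    print(decode_touch(2048, 1024))   # -> (400, 150)

In a real 4-wire controller the electronics alternate which layer carries the voltage gradient, measuring X and Y in turn, but the proportional mapping is the same idea.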
In the capacitive system, a layer that stores electrical charge is placed on the glass panel of the monitor. When a user touches the monitor with his or her finger, some of the charge is transferred to the user, so the charge on the capacitive layer decreases. This decrease is measured in circuits located at each corner of the monitor. The computer calculates, from the relative differences in charge at each corner, exactly where the touch event took place and then relays that information to the touch-screen driver software. One advantage that the capacitive system has over the resistive system is that it transmits almost 90 percent of the light from the monitor, whereas the resistive system only transmits about 75 percent. This gives the capacitive system a much clearer picture than the resistive system.
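To make the corner calculation concrete, here is a simplified Python sketch (the interpolation formula and the numbers are illustrative assumptions, not a real controller's algorithm). The idea is that corners nearer the touch carry a larger share of the current, so the normalized shares can be mapped back to a position:

    # Simplified sketch of surface-capacitive position sensing (illustrative
    # formula and numbers only). A touch draws current through all four
    # corners; corners nearer the touch carry proportionally more of it.

    SCREEN_W, SCREEN_H = 800, 600

    def locate(i_tl, i_tr, i_bl, i_br):
        """Interpolate a touch position from four normalized corner currents."""
        total = i_tl + i_tr + i_bl + i_br
        x = (i_tr + i_br) / total * SCREEN_W   # right-hand share gives x
        y = (i_bl + i_br) / total * SCREEN_H   # bottom share gives y
        return round(x), round(y)

    # A touch near the top-right corner draws the most current there:
    print(locate(i_tl=0.2, i_tr=0.5, i_bl=0.1, i_br=0.2))   # -> (560, 180)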
On the monitor of a surface acoustic wave system, two transducers (one receiving and one sending) are placed along the x and y axes of the monitor's glass plate. Also placed on the glass are reflectors, which bounce the ultrasonic wave sent from one transducer to the other. The receiving transducer can tell at what instant the wave was disturbed by a touch event, and can locate the touch accordingly. The wave setup has no metallic layers on the screen, allowing for 100 percent light throughput and perfect image clarity. This makes the surface acoustic wave system best for displaying detailed graphics (the other two systems both degrade clarity noticeably).
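The location step can be pictured as a time-of-flight measurement: the touch absorbs part of the wave, and the delay at which the received signal dips tells the controller how far along each axis the touch sits. Here is a hedged Python sketch, with an assumed wave speed and panel size:

    # Illustrative sketch of locating a touch on a surface acoustic wave panel.
    # The wave speed and panel size below are assumptions for the example.

    WAVE_SPEED_MM_PER_US = 3.0        # assumed wave speed across the glass
    PANEL_W_MM, PANEL_H_MM = 300, 225

    def locate(dip_delay_x_us, dip_delay_y_us):
        """Map the delay of each axis's signal dip to a position in mm."""
        x = min(dip_delay_x_us * WAVE_SPEED_MM_PER_US, PANEL_W_MM)
        y = min(dip_delay_y_us * WAVE_SPEED_MM_PER_US, PANEL_H_MM)
        return x, y

    # A dip 50 microseconds into the x sweep and 25 into the y sweep:
    print(locate(50, 25))   # -> (150.0, 75.0)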
Another area in which the systems differ is in which stimuli will register as a touch event. A resistive system registers a touch as long as the two layers make contact, which means that it doesn't matter if you touch it with your finger or a rubber ball. A capacitive system, on the other hand, must have a conductive input, usually your finger, in order to register a touch. The surface acoustic wave system works much like the resistive system, allowing a touch with almost any object -- except hard and small objects like a pen tip.
As far as price goes, the resistive system is the cheapest, though its clarity is the lowest of the three and its layers can be damaged by sharp objects. The surface acoustic wave setup is usually the most expensive.

source: www.howstuffworks.com

Apple MacBook Air vs. Ultrabooks: What Are the Major Differences?


Ultrabooks are making waves in the tech market. Even the 2012 Consumer Electronics Show was dominated by fresh Ultrabooks from various computer makers. Apple was the first computer maker to start the slim revolution in laptops, with its MacBook Air in 2008. Later, Intel coined the Ultrabook concept: thin, lightweight notebooks with Intel’s high-performing second-generation Core i-series CPUs and SSD storage options.
Over the last several months, almost all leading computer makers have launched Ultrabooks that comply with the standards set by Intel; Asus, Acer, Samsung, Toshiba, Lenovo, HP and Dell are among them. Here we compare the new Windows-based Ultrabooks with Apple’s MacBook Airs, which come in two screen sizes: 11-inch and 13-inch.
Intel Core processors
Most of the new Ultrabooks announced at CES 2012 feature Intel’s third-generation Core i-series Ivy Bridge processors. The MacBook Airs, which were upgraded last year, run on Intel’s Sandy Bridge processors. The new processor technology provides 30 percent better graphics performance and 20 percent better processor performance than Sandy Bridge. In addition, the new processors bring USB 3.0 and PCI Express 3.0 support to Ultrabooks. The MacBook Air falls short of the Ultrabooks here; however, Apple may soon upgrade its flagship notebooks with Ivy Bridge processors.
Affordable prices
There is a huge variety of Ultrabooks in stores. Various makers have launched their own products, so you can find Ultrabooks at many price points. Models like the Acer Aspire S3, the Toshiba Portege Z835-P330 and many others sell for under $1,000, while Apple sells the higher-end version of its MacBook Air for $1,299. That is a real threat to the Cupertino technology maker, which may be forced to cut its prices in the future.
Windows 8
All upcoming Ultrabooks will run Microsoft Windows 8, which is touted as the most advanced version of the Windows operating system. Windows 8 is Microsoft’s first unified Windows version, supporting both tablets and notebooks. It is also touch-enabled and sports a striking Metro-style interface, and a new Windows Store will add to its strengths. Meanwhile, the MacBook Airs run Mac OS X 10.7 Lion, Apple’s latest Mac OS version.
Long battery life
Intel’s Ultrabook guidelines ask manufacturers to deliver at least five hours of battery life. In fact, this standard seems to have been set by the performance the MacBook Air has been offering. Even so, many high-end Ultrabooks, both forthcoming and already available, offer longer battery life than a MacBook Air.
Storage capacity
Storage is yet another area where Ultrabooks have surpassed the levels set by the Apple MacBook Air. The Core i5 version of the MacBook Air gives you a 128GB SSD. Meanwhile, many Ultrabooks pair their SSDs with additional hard drives for extra capacity, and some offer larger SSDs outright.
Portability
The MacBook Air may have been the thinnest and lightest notebook around for a long time, but the Ultrabook revolution has brought similarly slim notebooks from many manufacturers. Moreover, some Ultrabooks weigh even less and offer instant-on technology, competing with the MacBook Air in almost every respect.
What do others say?
PCMag’s Brian Westover, comparing Ultrabooks with the Apple MacBook Air, says that even the first flock of Ultrabooks poses a serious threat to the MacBook Air.
“There are certainly compelling reasons not to dismiss the Apple MacBook Air, but when all things are taken into consideration, the first crop of Ultrabooks makes a strong showing and gives the Apple MacBook Air a run for its money.”
On the other hand, Craig Simms of CNET says the MacBook Air remains the favorite and the best thin-and-light notebook on a market now filled with svelte competitors.
“So which would we go for? It’s still the MacBook Air, despite all the new svelte laptops vying for the crown. The combination of usability, build quality and performance ensures it’s still our favourite thin and light laptop.”
Summing up
It is now up to you to decide whether you want an Ultrabook or an Apple MacBook Air. Indeed, when performance, thinness, battery life and price are all considered, stores hold plenty of Ultrabook options that better the MacBook Air.

source: http://nvonews.com/

January 12, 2012

Consume Less Energy With Blade Servers


In this extremely competitive market, well-informed business owners know that reducing power consumption is a smart and environmentally friendly way to cut operating expenses without reducing product quality or employee output. Switching IT processing to a Dell Blade system can save money while increasing productivity.

Improved Design

Traditional rack servers bundle components into individual cabinets along with separate energy-consuming devices like graphics cards and keyboards. Each server requires its own power source and extensive cabling and patching. A single optimized blade system can replace an entire room of rack servers and operate from a solitary power source. The innovative design improves the effectiveness of internal fans and cooling systems, reducing current draw by as much as 65 percent for a fully loaded blade chassis compared with similarly configured rack systems. Since a single cabinet houses an entire block of servers, the need for external cooling systems or “cold rooms” drops as well.

Virtualization

Server virtualization improves power consumption by reducing the hardware requirements of businesses and institutions. A blade server can comfortably host information and applications that were formerly housed in a dozen or more rack servers. Standard servers that run only one application often operate at or below 20 percent capacity. Blade systems maximize efficiency by ensuring that each server component works closer to its full capability. By allowing systems and applications to move freely between servers, blade systems increase productivity and eliminate unused or underutilized CPU capacity. Data transfers easily between machines. Once configured, individual components can be removed, repaired or replaced while the system remains fully operational, decreasing the need for shutdowns.
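The capacity argument is easy to put into numbers. Here is a back-of-the-envelope sketch in Python, using the 20 percent figure above and an assumed (purely illustrative) 80 percent safe ceiling for a consolidated machine:

    # Back-of-the-envelope consolidation arithmetic (illustrative numbers).
    standalone_utilization = 0.20   # typical single-application server load
    target_utilization = 0.80       # assumed safe ceiling for one blade

    workloads_per_blade = target_utilization / standalone_utilization
    print(f"One blade of equal capacity absorbs ~{workloads_per_blade:.0f} such workloads")
    # -> One blade of equal capacity absorbs ~4 such workloads

The real consolidation ratio depends on workload peaks and memory, not just average CPU load, but the arithmetic shows why one blade can comfortably replace several lightly loaded servers.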

Increased IT Performance

Blade servers not only increase the energy efficiency of your computing operations; they also reduce the energy expended by your IT personnel. Blade systems are easy to install, easy to manage and easy to maintain. Switching to a virtualized blade server limits end-user responsibility, employee calls to IT, and the time consumed in solving simple operator errors. Since they integrate interfaces across many servers, blade systems also reduce the time spent updating or installing new software and reconfiguring applications.
Blade technology was once employed exclusively by large corporations, but small and mid-sized businesses are now embracing it to become more environmentally friendly while increasing productivity and cutting costs. Dell blade servers can take your business to a higher level and take a bite out of your utility bill.

source: http://www.techpark.net

January 9, 2012

Cloud Hosting for ZEROES


What is cloud hosting?

If you run a website, there are two kinds of cloud computing you might have heard about: cloud servers and cloud-based content delivery networks (CDNs). What are they, how do they work, and what benefits do they bring?

Cloud servers

Traditionally, web hosting came in two flavors: high-cost managed hosting, in which you have your own private server (an actual computer!) dedicated to running only your website and its applications, and low-cost shared hosting, where your site and apps run on a large server alongside a number of other sites operated by other people. Now there's a third option, widely marketed as cloud hosting, in which your site runs on a virtual server somewhere up in the cloud. Depending on how it's set up, a cloud server might be an actual computer, but it's just as likely to be a chunk of a much bigger machine—as with other kinds of cloud computing, the point is that it shouldn't matter either way to you as an end user. Rackspace's Cloud Servers, Liquid Web's Storm on Demand, and Amazon's Elastic Compute Cloud (EC2) are three examples of this kind of cloud hosting—and there are many more.

An example cloud server

So what's a cloud server like in practice? It's relatively easy to sign up to cloud services and see for yourself. With Storm on Demand, one of the cloud services I've used, you simply create a billing account and then tick the kind of server you want from a list of common examples (running from 1GB memory and 1 CPU up to 96GB memory and 32 CPUs). Then you tick the "server image" (essentially the software you want on the server at startup, including the operating system) and specify whether you want a managed or self-managed server. Finally, you specify whether you want backups of your data and how you'll pay for bandwidth (either in large, specified blocks of GB or per GB used). When that's all done, you click to create the server and it's all "built" for you, on the fly, in a matter of minutes.
You can scale up or down any of the parameters you've chosen at any time (so at a time of peak demand—an end-of-season sale, perhaps—you could double or triple the power of your machine for a week or two before scaling back down again when traffic returns to normal).
Once the server's created, you can configure it in the usual way (just like a physical server) with software like WHM and cPanel—or however you wish. If you decide you no longer want your server you can destroy it just as easily, and you simply pay for what you've used (an hourly rate for the server and a per GB rate for the bandwidth). It's extremely easy to use. Even with only previous experience of shared hosting and no previous experience of WHM whatsoever, I had this website up and running on a Storm cloud server in a couple of hours.
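The pay-as-you-go billing is simple enough to model. Here is a tiny Python sketch with invented rates (the numbers are placeholders for illustration; check your provider's actual price list):

    # Hypothetical pay-as-you-go cost model. The rates are invented for
    # illustration; check your provider's actual price list.

    HOURLY_RATE = 0.12      # dollars per server-hour (assumed)
    BANDWIDTH_RATE = 0.10   # dollars per GB transferred (assumed)

    def monthly_cost(hours_run, gb_transferred):
        """Total bill: server time plus bandwidth, both metered."""
        return hours_run * HOURLY_RATE + gb_transferred * BANDWIDTH_RATE

    # A server running all month (~720 hours) that serves 150 GB:
    print(f"${monthly_cost(720, 150):.2f}")   # -> $101.40

The same model shows why temporary scaling is affordable: doubling the server's size for a two-week sale only doubles the hourly component for those hours.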
Photos: Liquid Web's Storm on Demand allows you to set up a cloud server in a matter of minutes, simply by ticking a few boxes. Every aspect of the service is pay-as-you-go. It's easy to use even if you have little or no experience of setting up or managing dedicated servers.

Cloud servers or virtual servers?

How do cloud servers work? The important thing to remember is that "cloud server" is essentially a marketing term and not a technical description or explanation. Hosting products described as "cloud servers" are generally virtual slices of large, physical servers running what's called virtualization software (the most common types being VMware® and Xen® hypervisor for Linux and Microsoft® Hyper-V™ for Windows). In other words, they are effectively "virtual servers" (entirely independent virtual machines) running on a real, physical server. How is that different from shared hosting? The virtual servers are essentially independent of one another (though they do use the same processors and memory), so you're not at risk from other people's applications or websites. You have full root access to your virtual server (unlike on shared hosting, where different users' files are simply subdirectories of a single server running a single operating system), and you can reboot or reimage as you wish—you can even run entirely different operating systems on the same physical server. The main benefit of using virtualization is reducing the number of physical servers you have to buy and manage. However, that doesn't necessarily translate into the cost savings you might expect, because support costs may be higher and you may still need multiple software licenses for each virtual server.

Cloud-based content delivery networks (CDNs)

Your website can benefit hugely from cloud computing even if you don't want to migrate it to a cloud server. Information-rich sites like this one, with a lot of static content, typically use over 90 percent of their bandwidth serving up images (and other media) and CSS files that probably don't change from one month to the next. With traffic split equally between Europe, America, and Asia, there's no easy way to decide where to locate your main server: wherever you choose, some users will benefit and others will lose out. But putting the static content on a content delivery network (CDN), dispersed across the cloud, will benefit everyone. Simply speaking, a CDN makes multiple copies of your static files and stores them at key "edge locations" around the world so that different users in different continents receive whichever files are nearest (and therefore quickest to download).

How do you set up a CDN in practice?

Suppose you want to speed up your website by moving all your images on to a CDN. You can sign up for a pay-as-you-go CDN in a matter of minutes (Amazon's CloudFront and Rackspace Cloud Files are two popular, instant options, but there are plenty of others). Once you've sorted out the billing, you simply upload your files (in a similar way to using FTP) and you'll receive a web address (such as abcdefg123456789.cloudservice.whatever) that you can use to link to them. You can either use this address explicitly (referring to it directly in your IMG tags) or (more sensibly) refer to it through a CNAME (effectively a DNS alias) based on your own domain name. When people download your pages, the images are no longer pulled from your main server but from one of the edge locations around the world—ideally one that's geographically close to where they happen to be.
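In practice, "moving your images to a CDN" often boils down to rewriting the URLs your pages use for static assets. A minimal Python sketch of the idea, using placeholder hostnames that echo the example address above:

    # Minimal sketch of pointing static assets at a CDN hostname. The domain
    # names are placeholders, echoing the example address above.

    import re

    # A CNAME you control, aliasing abcdefg123456789.cloudservice.whatever:
    CDN_HOST = "static.example.com"

    def rewrite_img_tags(html):
        """Point relative image paths at the CDN instead of the main server."""
        return re.sub(r'src="/images/', f'src="http://{CDN_HOST}/images/', html)

    page = '<img src="/images/logo.png" alt="logo">'
    print(rewrite_img_tags(page))
    # -> <img src="http://static.example.com/images/logo.png" alt="logo">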
How does it work behind the scenes? It's easy to see if you do a DNS lookup for whatever domain name you're using for your CDN. Instead of a single IP address, you'll find the name resolves to different IP addresses in different parts of the world. In other words, the files resolve to a different IP address depending on where the end user happens to be. So for a person on the West Coast of the United States, abcdefg123456789.cloudservice.whatever might resolve to a server in Mountain View, California, while for a user in Europe, the same domain might resolve to a server physically located in Paris, France or London, England.
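You can try this yourself with a few lines of Python (the hostname below is a placeholder; what you see depends on which resolver and location you query from):

    # See what a CDN hostname resolves to from where you are. The hostname
    # is a placeholder; the addresses returned depend on your location.

    import socket

    name = "cdn.example.com"   # substitute the hostname your CDN gave you
    try:
        hostname, aliases, addresses = socket.gethostbyname_ex(name)
        print(hostname, addresses)
    except socket.gaierror:
        print(f"{name} did not resolve; substitute your real CDN hostname")

    # Run the same lookup from another continent and you will typically
    # see a different set of IP addresses for the same name.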
Pros and cons? There is almost always a significant performance boost from moving to a CDN, but if you're paying a fixed price for your web hosting (or server) bandwidth, it will probably work out as an extra cost. CDNs rely on your files being copied, periodically, from the central server where you upload them to the edge locations around the world where they're served to users and typically cached for anything from a few days to several weeks or more (you can generally specify the cache expiry time), so file management and updating can sometimes be a problem. For example, suppose you set a 30-day cache on your main CSS file but suddenly want to change the way some aspect of your site is presented. You can either upload a new CSS file and wait up to 30 days for all the edge locations to reflect the change, or rename your CSS file (and all the pages that reference it) and upload new versions. Either way, you lose a certain amount of flexibility in file management, and it's important to remember that different users in different locations may see different versions of the same file for a period of time. That's why CDNs work best for static (rarely changing) content.
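One common workaround for the 30-day-cache problem is to version the filename itself, so any change produces a brand-new URL that the edge locations have never cached. A sketch of the idea in Python (not any particular framework's implementation):

    # Embed a content hash in the filename so a changed file automatically
    # gets a brand-new URL (a sketch, not a particular framework's feature).

    import hashlib

    def versioned_name(path, content):
        """Return e.g. 'site.<8-char-hash>.css' for the given file content."""
        digest = hashlib.md5(content).hexdigest()[:8]
        stem, dot, ext = path.rpartition(".")
        return f"{stem}.{digest}{dot}{ext}"

    css = b"body { color: #333; }"
    print(versioned_name("site.css", css))   # -> site.<hash>.css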

Worth a go?

One of the best things about cloud services is that they're generally pay-as-you-go—so it's very easy to try them out, at relatively little cost, and see what difference they make.

January 7, 2012

2012 Gadgets


LYTRO CAMERAS

Entry-levels, SLRs, DSLRs, digicams, Micro Four Thirds – there's a lot of hullaballoo around the fancy cameras out there, but there might just be one device that could put an end to it all – the Lytro.
This small rectangular camera is made of anodised aluminium, making it lightweight yet sturdy. The USP of the Lytro is that it lets you change the focus of a picture after you take it.
And you can do this days, weeks or even years after the shot. There is no conventional delay caused by the lens auto-focusing when you press the shutter button. The Lytro's compact design is driven by its 8x optical zoom lens, which features a constant f/2 aperture.
By using all of the available light in a scene, the Lytro performs well in low-light environments without the use of a flash. And who wouldn't want this camera in their backpack when it weighs just about 220 grams?
The Lytro is available in both 8GB and 16GB models, storing 350 and 750 pictures respectively.

NINTENDO WII U

You might already know what you want from Santa next Christmas - the next from the Nintendo stable is the ‘Wii U'.
This will be the first gaming console from Nintendo to deliver 1080p high-def graphics. The controller boasts a 6.2-inch touchscreen and a stylus to work on it.
The display on the controller will not support multi-touch; however, there are a host of sensors installed in it, such as a gyroscope and an accelerometer. The unit also has a built-in microphone, speakers and a camera.
The console uses SD cards and flash memory to load and save games and doesn't come with a hard drive.
A preliminary list of games compatible with the Wii U is already out and includes some exciting names - Super Mario, the Legend of Zelda – Skyward Sword, Luigi's Mansion 2 and Kid Icarus: Uprising.

SMART GARMENTS

The concept of smartphones getting ‘smarter’ sounds a little overdone. How about your shirt getting a new brain or two? Under Armour, a company that specialises in sports gear and clothing, has an ace up its sleeve: it has produced the Under Armour E39, a shirt that can track your athletic performance.
The E39 tracks your biometric signals and transfers them to the ‘heart’ of the shirt. The ‘heart’ is Physiological Status Monitor (PSM) technology, which the US Special Forces and connected-health enthusiasts have already signed up for.

WINDOWS 8

By now, we've all seen the numerous screen grabs from the developer's version of Windows 8.
Not only does it look bright and enticing, it already has a couple of OEMs swearing by the Windows 8 experience. The Metro-style apps, tiled on the home screen, make for an interesting interface and give you updates on almost anything you want, right there.
You'll have an exclusive Windows Store from which to download these Metro-style apps. A major plus is the freedom to sign in with your Windows Live account rather than just a localised user profile. This means you will have the same personalised settings and files on multiple devices running the Windows 8 OS.
The latest buzzword around Windows 8 is ‘picture passwords’. According to the official blog, an interactive picture password pairs a picture you choose with preset gestures drawn on it. So if I have a picture of my pup, I can draw a circle or a heart around his face to unlock my Windows 8 tablet. How cute is that!

APPLE IPAD 3

The first Apple iPad was a game-changer, and the second was just about a better version of its predecessor. What does the third have in store? Well, if rumours are to be trusted, we might see not one but two new iPads: a higher-end version and a ‘budget’ version.
The costlier one might ship with an 8-megapixel camera while the cheaper one might sport a 5-megapixel unit. Both are rumoured to feature 9.7-inch screens, but with an advanced Retina display. Apple also recently filed a patent for facial recognition. This could result in software that lets you use the front camera to log in to your profile, so you can use your chosen apps and settings on the iPad.
What's better, the higher end iPad might have a battery double the capacity of a regular one now. The regular ones are usually around 6,000 mAh. This could means days and days of unbarred tab usage!

SAMSUNG GALAXY S III

In 2011, the Samsung Galaxy S II was the ultimate phone of the year. Something that good absolutely deserves a quick successor considering how quickly things change in techland.
Not only is there talk of Samsung including its stunning Super AMOLED Plus display, a quad-core processor and 2GB of RAM in the device, but also a 3D display to gain an edge over the Apple iPhone. It might go a notch higher and have a 10-megapixel camera as its primary shooter. The handset will run Google’s Ice Cream Sandwich.
It's not clear if the 3D version will only be a variant of the third Galaxy S iteration. But, if you go by the quality and success of 3D smartphones already in the market, Samsung might want to keep its options open.
For more on gadgets, visit http://www.thehindubusinessline.com/features/smartbuy/

January 4, 2012

Importance of Back-up


We’ve all been there, sat staring at the computer with our hands behind our heads in dismay as our hard drive fails and all those brilliant holiday snaps are lost and gone forever. You know that no amount of turning it off and on again will help recover them, but you do it anyway.  
It is then that you realize that you have all your passwords saved in one vital document on your desktop and that all-important presentation you have to do tomorrow morning at work has disappeared into the world of unrecoverable data.
This avoidable catastrophe always involves a lot of swearing, a lot of turning the computer off at the mains, and a good crack to encourage it to whirr back to life. Yeah, it’s happened to the best of us, and we’ve all learnt from our mistakes and invested in some form of backup, whether a Western Digital drive or an online cloud drive. Next time you will too, right?
We thought as much, so here’s our simple guide to four easy backup options that will save you the stress, and even the money, of trying to recover lost files.

CD/DVD

Copying important files onto CDs and DVDs is a simple, if almost prehistoric, way of lowering the risk of losing everything. However, if you back up as often as you should, you’ll end up with piles and piles of old discs holding old files all over your office or spare bedroom. With bookcases, shelves, drawers and floors covered from top to bottom in dust-gathering discs, you’ll be surrounded by a sea of data that takes hours to filter through to find what you need.

USB sticks

Uploading your files to a small flash drive is much simpler than creating and sorting through multiple discs. However, with their small capacities you’ll have to use several to back up everything you need. If you choose to use mini flash drives to store all of your photos, music and work files, keep them in a safe place, as they are easily lost under car seats, in handbags and in the pub.

External hard drive

Attach a Western Digital external hard drive to your computer to back up your files with ease. Just keep the hard drive hooked up to your PC or Mac and let the hardware do the hard work. External hard drives can continuously update and back up everything, so you’ll never need to worry if you forget to drag something across.

Online backup

If you don’t want to back up your backup, go for an online service, which will come hand-in-hand with yearly fees in return for peace of mind. This is the easy solution for anyone who constantly forgets to keep their documents safe. There are some free options available online, but they often don’t give you the luxury of a full computer backup.

source: www.techgeeze.com

Top 10 Most Anticipated Gadgets of year 2012


It’s that time of the year again, and we’re not talking about Christmas here. The beginning of a new year means a whole new slew of gadgets to look forward to. As far as 2011 is concerned, many were thrilled by the news of the iPad 2, and many were also disappointed that the iPhone 4S wasn’t the iPhone 5. At the other end of the spectrum, we got a slew of Android devices like the Galaxy Nexus and Galaxy Note, as well as the release of Ice Cream Sandwich to end the year. 2011 also saw the tie-up of Microsoft and Nokia, which will hopefully bring in new gadgets, as well as some much-needed revenue for Nokia. With all of that said, what can we expect in 2012, the year the world’s supposed to end?
iPhone 5
Despite the iPhone 4S being a great device overall, the lack of physical changes, as well as the simple addition of an "S" to its predecessor’s moniker, made many people skip this release. It is in this vein that many are eagerly anticipating an "actual" new iPhone. Speculation about an aluminum back and a bigger 4-inch screen has many excited. Will Apple actually make this wishful thinking a reality?
iPad 3
The iPad was a sure game-changer when it came out a couple of years ago, and it got a lot better when the iPad 2 was released in the first half of 2011. So how will Apple improve on this tried and tested device with its third iteration? Add to that the fact that we lost Steve Jobs late in 2011: is there a chance the iPad will take a different route under new leadership? Well, everyone’s hoping for a quad-core processor to power this thing, making it the next best mobile gaming device. Again, more wishful thinking from the Apple fans?
Samsung Galaxy SIII
OK, don’t get us wrong on this: while the Galaxy Nexus is a nice phone in itself, it doesn’t have the vibe that the Galaxy S or the S II had when they were released. Given that Samsung did remarkably well with the S II, selling around 10 million units of the phone, many predict that it will continue the winning formula with the third installment of the Galaxy S series of phones, hopefully catching up with Apple and its flagship device.
Xbox 720?
It’s been a while since the current generation of gaming consoles was released, and we believe they’re nearing the end of their cycles. With that said, the loudest of all the next-generation console rumors concerns Microsoft’s Xbox 720. It’s a no-brainer how they came up with that name, but here are the rumored specs: an 8-core processor, plus support for 3D, multi-TV, live TV and webcam functionality (i.e. a built-in Kinect?). In any case, Microsoft is rumored to release the 360’s successor in 2012.
Nintendo Wii U
Being at a technological disadvantage against the PlayStation 3 and Xbox 360 didn’t stop the Wii from becoming a household name in gaming and entertainment. Its focus on interactivity even forced the competition to release products similar to the Wii’s concept. So how will Nintendo up its game? Simple: just add a ‘U’ to the Wii. Yeah, that name still makes us do weird reactions, but Nintendo is betting that the Wii U will once again dominate the gaming scene with a new controller featuring a built-in 6-inch touch screen, among the other improvements packed into it. Nintendo is set to release the Wii U in March 2012, and until then we can only speculate about how it will fare.
PlayStation Vita
Trying to forget the gadget that was the PSP Go, Sony is hoping the PlayStation Vita will match the success of the original PSP, which sold almost 90 million units over its lifetime. And this is one small beast never to be underestimated, with its quad-core processor, a front touch screen and a rear touch pad, two analog sticks, and a socially integrated UI that takes a lot of getting used to. And while the PlayStation Vita got its Japanese release in mid-December, those in the western hemisphere will have to wait until February to get their hands on this gaming device.
Quad-Core Mobile Phones/Tablets
With Nvidia introducing its Tegra 3 quad-core processor, gone are the days when desktops and high-end laptops were the only ones to get the quad-core treatment. ASUS immediately seized the opportunity with its Transformer Prime (despite being sued over the name), powering the device with the Tegra 3 chipset. This proved to be a great choice, as the Transformer Prime was indeed a very fast tablet, and it sets a precedent for more quad-core devices to come to market, hopefully in 2012. So yeah, mobile computing will really be a notch higher with these kinds of devices coming.
MacBook Pro + Retina Display
There’s no denying that any Apple device with a Retina display produces the most beautiful screen visuals. People are actually wishing that devices other than the iPhone be equipped with the same screen. Specifically, they want their MacBook Pros to have a resolution on par with the Retina display, something around 2880 x 1800. Recent rumors point to exactly this feature, and people are hoping they’ll be getting it in 2012. But again, keep your grains of salt handy.
"UltraBooks"
You’re probably wondering, "What the heck are UltraBooks?" Well, take the concept of netbooks and make them thinner yet more powerful. Not much is known about how portable or how powerful these UltraBooks will be, but the leading manufacturers (ASUS, Acer, Samsung and others) are making their devices ever thinner, and with the dawn of more powerful mobile hardware it is possible for these cheaper laptops to carry higher-end specifications.
Apple HD-TV
The list comes full circle with another possible offering from Apple: the expected release of an Apple HD-TV. Yep, it’s not just the small box that you’ve known Apple TV to be; this is an actual TV set, rumored to range from 15 to 19 inches, with iOS bundled inside. On top of that, you can play games, run apps and integrate social networking sites just as you do with your iPhone or iPad. So yeah, even in TV sets, Samsung and Apple are destined to be rivals.

source: www.techgeeze.com

The Future of Your PC's Hardware (5) : Wireless Power Transmission


Wireless power transmission has been a dream since the days when Nikola Tesla imagined a world studded with enormous Tesla coils. But aside from advances in recharging electric toothbrushes, wireless power has so far failed to make significant inroads into consumer-level gear.
What is it? This summer, Intel researchers demonstrated a method--based on MIT research--for throwing electricity a distance of a few feet, without wires and without any dangers to bystanders (well, none that they know about yet). Intel calls the technology a "wireless resonant energy link," and it works by sending a specific, 10-MHz signal through a coil of wire; a similar, nearby coil of wire resonates in tune with the frequency, causing electrons to flow through that coil too. Though the design is primitive, it can light up a 60-watt bulb with 70 percent efficiency.
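It's worth spelling out what 70 percent efficiency means at the wall. A quick arithmetic check in Python:

    # What "70 percent efficiency" means for a 60-watt bulb:
    bulb_watts = 60
    efficiency = 0.70
    input_watts = bulb_watts / efficiency
    print(f"{input_watts:.0f} W drawn to deliver {bulb_watts} W")
    # -> 86 W drawn to deliver 60 W

In other words, the remaining 30 percent of the input power is lost in transfer, which is fine for a demo but one more reason commercialization still has work ahead.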
When is it coming? Numerous obstacles remain, the first of which is that the Intel project uses alternating current. To charge gadgets, we'd have to see a direct-current version, and the size of the apparatus would have to be considerably smaller. Numerous regulatory hurdles would likely have to be cleared in commercializing such a system, and it would have to be thoroughly vetted for safety concerns.
Assuming those all go reasonably well, such receiving circuitry could be integrated into the back of your laptop screen in roughly the next six to eight years. It would then be a simple matter for your local airport or even Starbucks to embed the companion power transmitters right into the walls so you can get a quick charge without ever opening up your laptop bag.

source: www.pcworld.com

January 3, 2012

The Future of Your PC's Hardware (4) : USB 3.0 Speeds Up Performance on External Devices


The USB connector has been one of the greatest success stories in the history of computing, with more than 2 billion USB-connected devices sold to date. But in an age of terabyte hard drives, the once-cool throughput of 480 megabits per second that a USB 2.0 device can realistically provide just doesn't cut it any longer.
What is it? USB 3.0 (aka "SuperSpeed USB") promises to increase performance by a factor of 10, pushing the theoretical maximum throughput of the connector all the way up to 4.8 gigabits per second--enough to move roughly the equivalent of an entire CD-R disc every second. USB 3.0 devices will use a slightly different connector, but USB 3.0 ports are expected to be backward-compatible with current USB plugs, and vice versa. USB 3.0 should also greatly enhance the power efficiency of USB devices, while increasing the juice (nearly one full amp, up from 0.1 amps) available to them. That means faster charging times for your iPod--and probably even more bizarre USB-connected gear like the toy rocket launchers and beverage coolers that have been festooning people's desks.
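A quick back-of-the-envelope check of that CD-R claim (idealized Python arithmetic that ignores protocol overhead, so real transfers would be slower):

    # Idealized check of the "CD-R per second" figure (ignores protocol
    # overhead, so real-world transfers would be slower).
    usb3_bits_per_s = 4.8e9
    bytes_per_s = usb3_bits_per_s / 8              # 600 MB/s theoretical
    cdr_capacity_mb = 700                          # standard 80-minute CD-R
    seconds_per_cd = cdr_capacity_mb / (bytes_per_s / 1e6)
    print(f"{bytes_per_s / 1e6:.0f} MB/s, one CD-R every {seconds_per_cd:.2f} s")
    # -> 600 MB/s, one CD-R every 1.17 s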
When is it coming? The USB 3.0 spec is nearly finished, with consumer gear now predicted to come in 2010. Meanwhile, a host of competing high-speed plugs--DisplayPort, eSATA, and HDMI--will soon become commonplace on PCs, driven largely by the onset of high-def video. Even FireWire is looking at an imminent upgrade to as much as 3.2 gigabits per second. The port proliferation may make for a baffling landscape on the back of a new PC, but you will at least have plenty of high-performance options for hooking up peripherals.

source: www.pcworld.com

January 2, 2012

The Future of Your PC's Hardware (3) : Nehalem and Swift Chips Spell the End of Stand-Alone Graphics Boards


When AMD purchased graphics card maker ATI, most industry observers assumed that the combined company would start working on a CPU-GPU fusion. That work is further along than you may think.
What is it? While GPUs get tons of attention, discrete graphics boards are a comparative rarity among PC owners, as 75 percent of laptop users stick with good old integrated graphics, according to Mercury Research. Among the reasons: the extra cost of a discrete graphics card, the hassle of installing one, and its drain on the battery. Putting graphics functions right on the CPU eliminates all three issues.
Chip makers expect the performance of such on-die GPUs to fall somewhere between that of today's integrated graphics and stand-alone graphics boards--but eventually, experts believe, their performance could catch up and make discrete graphics obsolete. One potential idea is to devote, say, 4 cores in a 16-core CPU to graphics processing, which could make for blistering gaming experiences.
When is it coming? Intel's soon-to-come Nehalem chip includes graphics processing within the chip package, but off of the actual CPU die. AMD's Swift (aka the Shrike platform), the first product in its Fusion line, reportedly takes the same design approach, and is also currently on tap for 2009.
Putting the GPU directly on the same die as the CPU presents challenges--heat being a major one--but that doesn't mean those issues won't be worked out. Intel's two Nehalem follow-ups, Auburndale and Havendale, both slated for late 2009, may be the first chips to put a GPU and a CPU on one die, but the company isn't saying yet.
source: www.pcworld.com

January 1, 2012

The Future of Your PC's Hardware (2) : 32-Core CPUs From Intel and AMD


What is it? With the gigahertz race largely abandoned, both AMD and Intel are trying to pack more cores onto a die in order to continue to improve processing power and aid with multitasking operations. Miniaturizing chips further will be key to fitting these cores and other components into a limited space. Intel will roll out 32-nanometer processors (down from today's 45nm chips) in 2009.
When is it coming? Intel has been very good about sticking to its road map. A six-core CPU based on the Itanium design should be out imminently; Intel will then shift focus to a brand-new architecture called Nehalem, to be marketed as Core i7. Core i7 will feature up to eight cores, with eight-core systems available in 2009 or 2010. (And an eight-core AMD project called Montreal is reportedly on tap for 2009.)
After that, the timeline gets fuzzy. Intel reportedly canceled a 32-core project called Keifer, slated for 2010, possibly because of its complexity (the company won't confirm this, though). That many cores requires a new way of dealing with memory; apparently you can't have 32 brains pulling out of one central pool of RAM. But we still expect cores to proliferate when the kinks are ironed out: 16 cores by 2011 or 2012 is plausible (when transistors are predicted to drop again in size to 22nm), with 32 cores by 2013 or 2014 easily within reach. Intel says "hundreds" of cores may come even farther down the line.
source: http://www.pcworld.com