Monday, October 20, 2014

Transforming the Future of Lighting Systems with LED Technology

Although the origin of light-emitting diode (LED) technology can be traced back to 1927, it did not enter the commercial arena until much later, largely because of its high production cost; in recent years, however, it has been rapidly gaining ground. With the increasing demand for greener, more energy-efficient products, and with worldwide environmental strain on energy resources, is LED technology the answer to our lighting needs? What does the future hold for LED technology, and does it have what it takes to overtake the traditional light bulb, perhaps the most life-changing invention in human history?
LED lights are likely the most environmentally friendly lighting option available today. So what makes them the leading choice in industrial, architectural, and horticultural applications around the world? LEDs last as much as 20 times longer than other lighting sources and therefore don't need to be replaced as often, which reduces the impact of manufacturing, packaging, and shipping. LEDs are also designed to provide more than a decade of near maintenance-free service, and less servicing further reduces environmental impact.
Additionally, LEDs consume much less energy than incandescent and high-intensity discharge (HID) lights. LED lights use only 2–17 watts of electricity, 25%–80% less than standard lighting systems. And while compact fluorescent lights are also energy-efficient, LEDs burn even less energy. LEDs also contain no mercury, unlike their HID counterparts, whose mercury-laden remnants can seep into the water supply and adversely affect sea life and those who consume it.

According to the U.S. Department of Energy, “Widespread use of LED lighting has the greatest potential impact on energy savings in the United States. By 2027, widespread use of LEDs could save about 348 terawatt hours (compared to no LED use) of electricity: This is the equivalent annual electrical output of 44 large electric power plants (1000 megawatts each), and a total savings of more than $30 billion at today’s electricity prices.”
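The DOE's plant-equivalence arithmetic is easy to sanity-check. Below is a minimal Python sketch; the ~90% capacity factor is our own illustrative assumption, since the quote gives only the savings and the plant size.

```python
# Sanity-check the DOE equivalence: 348 TWh/year vs. 44 one-gigawatt plants.
HOURS_PER_YEAR = 8760

savings_twh = 348.0      # projected annual savings by 2027 (from the DOE quote)
plant_mw = 1000.0        # "large electric power plant" size from the quote
capacity_factor = 0.90   # assumed fraction of the year a plant runs at full power

# Annual output of one plant, converted from MWh to TWh (1 TWh = 1e6 MWh).
plant_twh = plant_mw * HOURS_PER_YEAR * capacity_factor / 1e6

print(f"One plant produces about {plant_twh:.2f} TWh/year")   # ~7.88 TWh
print(f"Equivalent plants: {savings_twh / plant_twh:.0f}")    # ~44
```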

The electrical maintenance required for lighting systems in public buildings that receive harsh and prolonged use, sometimes 24 hours a day, 365 days a year, is overwhelming. In public building management, time is money, and because LED fixtures need changing far less often than traditional lighting, maintenance staff spend less time on ladders replacing bulbs. LED lighting contributes to energy savings and sustainability by improving working conditions through deliberately directed light and by reducing the energy needed to power lighting fixtures.
A groundbreaking advancement in this area came to the forefront when Isamu Akasaki, Professor at Meijo University, Hiroshi Amano, Professor at Nagoya University, and Shuji Nakamura, a Japanese-born Professor now at the University of California, Santa Barbara, won the Nobel Prize in Physics earlier this month for inventing the world's first blue light-emitting diodes (LEDs).
While red and green LEDs had been around for some time, the elusive blue LED represented a long-standing challenge for researchers in both academia and industry. Without this critical last piece, scientists were unable to produce white light from LEDs, as all three colors needed to be mixed together for this to happen.
The white LED lamps that resulted from this invention emit very bright white light and are superior in terms of energy efficiency and lifespan when compared with incandescent and fluorescent bulbs. LEDs can last some 100,000 hours, whereas incandescent bulbs typically last only about 1,000. “With 20% of the world’s electricity used for lighting, it’s been calculated that optimal use of LED lighting could reduce this to 4%,” said Dr. Frances Saunders, President of the Institute of Physics. “This is physics research that is having a direct impact on the grandest of scales, helping protect our environment, as well as turning up in our everyday electronic gadgets.”
What's more, these LED lamps have the potential to improve the quality of life for the more than 1.5 billion people in the world who do not have access to electricity grids. Since LEDs require very little energy input, they can run on cheap, local solar power.
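To make the lifespan and electricity figures quoted above concrete, here is a quick back-of-the-envelope comparison in Python. The bulb wattages are illustrative assumptions for lamps of similar brightness, not figures from the article.

```python
# Illustrative comparison over one LED lifetime (100,000 hours).
LED_LIFE_H, INC_LIFE_H = 100_000, 1_000   # lifespans cited above
LED_W, INC_W = 10, 60                     # assumed wattages for similar brightness

replacements = LED_LIFE_H / INC_LIFE_H    # incandescent bulbs used up meanwhile
led_kwh = LED_W * LED_LIFE_H / 1000       # energy in kWh over 100,000 hours
inc_kwh = INC_W * LED_LIFE_H / 1000

print(f"Incandescent bulbs replaced over one LED lifetime: {replacements:.0f}")
print(f"Energy used: LED {led_kwh:.0f} kWh vs. incandescent {inc_kwh:.0f} kWh "
      f"({100 * (1 - led_kwh / inc_kwh):.0f}% less)")
```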
Steven DenBaars, a research scientist at UC Santa Barbara, has been working on LED lights for 20 years. In his laboratory, he is already onto the next big thing: replacing a substantial portion of indoor lights, and the archaic bulb-and-socket infrastructure on which they depend, with lasers.
According to DenBaars, a laser works much like an LED. "It's the same materials, but you put two mirrors on either side of the LED and it breaks into a laser. Once you get reflection back and forth, you get an amplification effect, and it goes from regular emission to stimulated emission."
Simply replacing the light-emitting diodes in a typical LED bulb with a laser diode wouldn't work. This hypothetical laser light bulb would catch fire from all the waste heat it would generate, not to mention produce an ungodly amount of light, more than enough to blind anyone who looked at it. Rather, DenBaars imagines using just a handful of tiny but powerful lasers and redirecting their light into fiber-optic cables and other light-transmitting plastics that could distribute it evenly as a warm, diffuse glow.

The headlights of BMW's "hybrid supercar," the i8, are the latest example of laser-based lighting technology. As with other lasers re-appropriated for conventional illumination, blue laser diodes are aimed at a phosphor that transforms the blue laser light into more diffuse white light. The result is headlights with such a long working life that they could "easily outlive the automobile" in which they're installed, notes IEEE Spectrum.
Laser lights could solve the problem of how to bridge the gap between traditional light sockets and more radical configurations of new lighting technologies. With just a few point sources of laser light installed in a building, their illumination can be redirected throughout a structure via plastic fiber-optic cables that could run along ceilings and around corners, just as the cable company runs its wires into buildings and through rooms without having to tear holes in walls or interface with the electrical system of a building. “Rather than route the electricity to the bulb you can route the light to the sources. LEDs let you do that too, but lasers would take it a couple steps further,” says DenBaars.
LEDs are helping change the way we light up our world, facilitating the development of environmentally friendly, energy-efficient light sources that offer a dramatic improvement over the incandescent bulbs pioneered at the end of the 19th century.

For our relevant BCC Research report on LEDs, visit the following link:

Monday, October 13, 2014

Minding Our Brain's Business: Unravelling the Neuroscience of the Human Brain

For centuries, the mysteries of grey matter have baffled scientists and researchers alike. How can humans store countless moments, past and present, in one single organ? How do animals map their paths back and forth? How do we figure out a shortcut to work when there's a big traffic jam? How do we recall where we parked our car? The questions go on and on.
The brain, as it turns out, has a GPS-like function that enables people to build mental maps and navigate the world. For this discovery, husband-and-wife scientists Edvard Moser and May-Britt Moser of Norway and New York-born researcher John O'Keefe were recently awarded the Nobel Prize in Physiology or Medicine, honoring breakthroughs in experiments on rats that could help pave the way for a better understanding of human diseases such as Alzheimer's. Their work explains how the brain creates a map of the space surrounding us and how we navigate our way through a complex environment. In other words, it reveals the brain's internal positioning system and gives clues to how strokes and Alzheimer's affect the brain.
"We can actually begin to investigate what goes wrong" in Alzheimer's, said O'Keefe. "The findings might also help scientists design tests that can pick up the very earliest signs of the mind-robbing disease, whose victims lose their spatial memory and get easily lost," he added.
It was in London in 1971 that O'Keefe, conducting his research on rats, discovered the first component of the brain's positioning system. O'Keefe, now director of the Sainsbury Wellcome Centre for Neural Circuits and Behaviour at University College London, found that a type of nerve cell in a brain region called the hippocampus was always activated when a rat was in a certain place in a room. Other nerve cells were activated when the rat was in other positions. O'Keefe concluded that these "place cells" were building up a map, not just registering visual input.
"I made the initial discovery over 40 years ago. It was met then with a lot of scepticism," the 74-year-old O'Keefe said. "And then slowly over years, the evidence accumulated. And I think it's a sign of recognition not only for myself and the work I did, but for the way in which the field has bloomed."
What is vital, however, is that knowledge of the brain's positioning system can also help researchers understand what causes loss of spatial awareness in stroke patients or in those with brain diseases like dementia, of which Alzheimer's is the most common form, affecting 44 million people worldwide.
In 1996, Edvard Moser and May-Britt Moser, now based in scientific institutes in the Norwegian town of Trondheim, worked with O'Keefe to learn how to record the activity of cells in the hippocampus. In 2005, they identified another type of nerve cell in the entorhinal cortex region in the brains of rats that functions as a navigation system. These so-called "grid cells," they discovered, are constantly working to create a map of the outside world and are responsible for animals' knowing where they are, where they have been, and where they are going.
The Nobel Assembly said the laureates' discoveries marked a shift in scientists' understanding of how specialized cells work together to perform complex cognitive tasks. They have also opened new avenues for understanding cognitive functions such as memory, thinking, and planning.
The finding, a fundamental piece of research, explains how the brain works but does not have immediate implications for new medicines, since it does not set out a mechanism of action.
For our relevant BCC Research reports on Alzheimer’s, visit the following links:


Friday, October 10, 2014

The Changing Face of the Healthcare Industry with mHealth Technologies

The health industry is increasingly responding to the rising popularity and availability of technological innovations, such as tablets and smartphones. The use of connected devices to collect patient data, monitor ongoing conditions, access health information, and communicate with providers, patients, and peers is a trend that is spreading like wildfire. Health applications have the potential to be adapted and used by healthcare professionals and consumers, helping to revolutionize the sector and reflect the digital age we live in.
Mobile health (mHealth), the use of mobile and wireless technologies to support healthcare systems and achieve healthcare objectives, can provide cost-effective solutions within a global healthcare environment that faces budget constraints, an increasing prevalence of chronic conditions, and a limited healthcare workforce. Following successful global trials, several mHealth services have entered the commercialization phase, and many mobile applications have been launched, stimulating partnerships among software developers, mobile operators, governmental and non-governmental organizations, and leading healthcare players.
Over the next decade, innovations in the mHealth market will be driven by the evolution of smartphone technologies, improvements in wireless coverage, and the remote treatment and monitoring of prevalent chronic diseases. According to a BCC Research report, the global mHealth market reached nearly $1.5 billion in 2012 and $2.4 billion in 2013, and it is expected to reach $21.5 billion in 2018, a compound annual growth rate (CAGR) of 54.9% over the five-year period from 2013 to 2018.
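The report's growth rate can be reproduced from its endpoint figures with the standard CAGR formula; here is a minimal sketch using the market values quoted above.

```python
# Compound annual growth rate (CAGR) from the BCC Research figures above.
start_value = 2.4   # global mHealth market in 2013 ($ billions)
end_value = 21.5    # forecast for 2018 ($ billions)
years = 5           # 2013 -> 2018

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"CAGR 2013-2018: {cagr:.1%}")   # ~55.0%, matching the reported 54.9% to within rounding
```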
For healthcare professionals, mobile or tablet apps also have enormous potential for training and professional development. Connectivity is built in, facilitating a blended learning platform with easily updatable information, in an accessible format. This allows for a truly flexible and enjoyable teaching and learning experience, ideal for both professionals and students, with information available anytime, anywhere.
Not only do health training and development apps provide more dynamic training tools, but they can also bring huge cost savings. Apps are inexpensive to produce and update, especially when compared to other training tools. Tablets and smartphones are readily available and the technology is relatively low cost when compared to other health technologies and professional training tools.
Apple’s new software, HealthKit, is designed to collect data from various health and fitness apps, making that data easily available to Apple users through the company’s new Health app. HealthKit is being developed to send data directly into hospital and doctors' charts, too.
Craig Federighi, Senior Vice President at Apple, at a conference held earlier this year, said, “Developers have created a vast array of healthcare devices and accompanying applications, everything from monitoring your activity level, to your heart rate, to your weight, and chronic medical conditions like high blood pressure and diabetes. ... [But] you can’t get a single comprehensive picture of your health situation. But now you can, with HealthKit.”
Mobile phone carriers such as Verizon and Sprint are also using their vast and trusted networks to bring mobile patient engagement and data to the forefront. “[Verizon’s] Converged Health Management is a perfect example of how we are using our unique combination of assets like our 4G LTE wireless network and cloud infrastructure to deliver an innovative, cost-effective and game-changing solution to the marketplace,” said John Stratton, President of Verizon Enterprise Solutions.
For relevant BCC Research reports on telemedicine technologies, visit the following links:
http://www.bccresearch.com/market-research/healthcare/telemedicine-technologies-report-hlc014g.html

Wednesday, October 8, 2014

MISSION SAVE EARTH: EMERGING TRENDS IN ENVIRONMENTAL SENSING AND MONITORING TECHNOLOGIES

Environmental field monitoring technologies have advanced rapidly in the last decade, concurrent with advances in digital technology, computational power and Internet-enabled communications. Environmental sensors have become much smaller, faster and often less expensive. Advances in air-sensing technologies, in particular, now enable rapid retrieval of time-critical pollution data on a large scale. Fast, low-cost sensors afford the possible networking of multiple units within a sensor grid network so that even street-level monitoring can be achieved.
Several governments across the globe are playing an active role in funding and encouraging environmental monitoring programs, thereby keeping the growth in the global market buoyant. In the U.S. market alone, some $250 billion of economic output stems from all pollution control and monitoring activities each year. Among the faster-growing segments of this business are the markets for sophisticated sensors; monitoring equipment; large-scale networks, such as satellite, GPS and remote sensing; associated networking equipment and ancillaries; and a large slate of new technologies. Globally, the markets for environmental sensors and the related subsegments account for approximately $13 billion of economic activity at present, with a projected average annual growth of 5.9% through 2019, according to a BCC Research report.
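Compounding the quoted figures forward shows what that growth rate implies; in this sketch the 2014 base year is our assumption, since the article says only that the market is worth roughly $13 billion "at present."

```python
# Project the environmental sensor market at the reported 5.9% annual growth.
base_value = 13.0   # $ billions "at present" (assumed base year: 2014)
cagr = 0.059

for year in range(2014, 2020):
    value = base_value * (1 + cagr) ** (year - 2014)
    print(f"{year}: ${value:.1f}B")   # reaches roughly $17.3B by 2019
```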
Among the key trends in the environmental sensors industry are miniaturization down to the nano scale, continuous and/or real-time sensing capabilities, wireless networked operation, rapid processing, and increased sensitivity or flexibility. Dominant trends in the sensors business include the development of more large-scale monitoring systems, such as remote sensing and satellite-based large-area sensors, and private companies are now getting into the environmental monitoring satellite business. Mobile environmental sensing systems are increasingly being tested and proposed for urban areas; such systems identify and monitor urban air pollution events, and the resulting data can be correlated with levels of local transport or industrial activity.

A new public housing estate in Singapore, to be launched in 2015 in the Punggol Northshore district, will install sensors to monitor residents' waste disposal. The housing authority will then analyze the collected data to deploy resources for waste collection. The district will also feature other smart technologies, such as intelligent car parking and smart lighting.

The United States' 2009 Economic Recovery Act provided hundreds of millions of additional dollars for research on environmental monitoring and sensors to U.S. entities such as the EPA, the DOE, NASA and certain government labs. Advancements have also been made in networking from space; with additional satellites, networked coverage of the globe's surface is becoming ever more comprehensive. China successfully launched the Yaogan-21 remote sensing satellite from Taiyuan Satellite Launch Center in September 2014; it will be used for scientific experiments, natural resource surveys, estimation of crop yields and disaster relief.

In light of the now-numerous international reports on climate change that confirm humanity's considerable impact on the environment, scientists agree that more sophisticated monitoring programs are urgently needed to detect ecological changes before they become irreversible. The surging need to monitor emissions continues to fuel the development of more sensitive and cost-effective environmental sensors. Improvements in nanotechnology and micro-electromechanical systems for sensor development, design, and production are expected to benefit the market; nanotechnology, for example, enables sensors to detect multiple analytes selectively and to monitor their presence in real time. The sensors business is a very dynamic area of the economy, and thus a sector with huge profit-making potential for those who can correctly identify future opportunities in environmental sensing.
For our relevant BCC Research report on environmental sensing and monitoring technologies, visit the following link:

http://www.bccresearch.com/market-research/instrumentation-and-sensors/environmental-sensor-markets-ias030c.html

Tuesday, September 23, 2014

To Preserve or Not To Preserve: Future of Stem Cell Research

The banking, or preservation, of stem cells, such as those from umbilical cord blood or bone marrow, has been increasing as the potential for using stem cells in clinical applications continues to fuel speculation and expectations. Researchers have been steadily exploring how best to use stem cells, which types to use, and how to deliver them to the body, findings that are not only transformational but also progressive and pragmatic.
Preliminary and promising research through clinical and experimental trials suggests that stem cells may be able to treat autoimmune diseases such as type 1 diabetes, as well as Parkinson's disease, brain and spinal-cord injuries, cardiovascular disease, liver disease, kidney disease, and breast cancer, among other illnesses. Whether donated or stored privately, banked cord blood is proving to be a rich source of life-saving treatment, now and in the future, as the possibilities of cord blood continue to expand.
"Initial studies suggest that stem cell therapy can be delivered safely," said Dr. Ellen Feigal, Senior Vice President of research and development at the California Institute for Regenerative Medicine, which has awarded more than $2 billion toward stem cell research since 2006 and is enrolling patients in 10 clinical trials this year. In addition to continuing safety research, Dr. Feigal added, "Now what we want to know is will it work, and will it be better than what's already out there?"
On the other hand, Dr. Charles Murry, Co-director of the Institute for Stem Cell and Regenerative Medicine at the University of Washington, believes that it is important to note that very few therapies beyond bone marrow transplants have been shown to be effective. Websites, newspapers and magazines advertising stem cell therapies leave the impression among the masses that such treatments are ready to use and that “the only problem is the evil physicians and government, who want to separate people from lifesaving therapies,” he said.
Scientists are now exploring direct therapies in new and innovative ways, such as reproducing and studying diseases in a dish using cells created from patients with specific ailments. Kevin Eggan of the Harvard Stem Cell Institute is using this technique to study amyotrophic lateral sclerosis (ALS), or Lou Gehrig's disease. He began his work five years ago when he took skin cells from two women dying of the same genetic form of ALS. He turned these skin cells into stem cells and then into nerve cells, and discovered an electrical problem: the cells weren't signaling to one another properly, which he theorized was probably causing the neural degeneration that characterizes ALS.
After replicating these nerve cells multiple times and testing various drug compounds to see which would correct the electrical signaling problem, he found a candidate drug — an existing medication approved for epilepsy — that will be tested in ALS patients as soon as the end of this year.
Increasingly, though, companies are competing with medical institutions to offer stem cell harvesting and preservation services. Parents seeking to preserve stem cells for their children are turning to umbilical cord blood as a source. The harvesting of stem cells from adults, for future use in cellular therapies or tissue engineering, has also emerged as a growing facet of the stem cell market, and companies are even appearing that offer patients the opportunity to clone their own lines of embryonic stem cells.
The real medical challenge, however, is to uncover which type of cell therapy best addresses each particular medical condition. Despite numerous experiments and the huge sums of money involved, scientists and researchers have yet to find the most cost-effective ways to deliver stem cells.
For our relevant BCC Research stem cell report, visit the following link:

Friday, September 19, 2014

THE BIG BANG THEORY OF WEARABLE TECHNOLOGY

The impending explosion of the wearable computing market is one of the most interesting and highly anticipated developments taking place in the high-tech industry. The $3 billion wearable consumer market is intrinsically linked with the $240+ billion smartphone market. The key market driver for wearable computing is the soaring global popularity of smartphones from manufacturers including Apple, Samsung Electronics, LG Electronics, HTC, Blackberry, Nokia and Microsoft.

The burgeoning field of wearable technology is hitting the mainstream, and one of the highlights of high-tech wearable devices is that they get smaller, faster, cheaper, and more powerful with every new product. The computing power of the room-sized Electronic Numerical Integrator and Computer (ENIAC) of the 1940s can now fit inside a chip in a musical greeting card. Similarly, today's smartphones are more powerful than the PCs of, say, five years ago. Now all the capabilities of a smartphone, such as making calls, taking pictures, connecting to the internet, and video chatting, are being condensed into smartwatches: practically everything a phone or a tablet can do.

If the growing trend in the wearable computing industry is to be believed, the time may soon come when phones and tablets are things of the past. Google Glass is a perfect example. The product is still under development, but if everything goes as planned, consumers will soon have no need for their standard smartphones. Google Glass will be able to respond easily to verbal commands, augmented by the occasional manual interaction via controls located directly on the frame. There has even been talk of eventually including a laser-projected virtual keyboard for those times when voice just isn't enough. With the ability to access countless sources of information in seconds and relay them to a miniature screen in the upper corner of the wearer's field of vision, Google Glass makes the connectivity features of a 4G smartphone seem archaic.

Motorola recently entered the ring with its Moto 360 smartwatch, which is primarily voice operated and can easily display messages and reminders on command. The result is a small, stylish accessory that serves as an assistant, calendar, and communicator all at once, taking over many of the jobs of the smartphone in the wearer's pocket.

However, Apple's entry into the smartwatch arena last week, with a device that won't go on sale until early 2015, raises questions: Can the company work its magic as it has in the past and convince people that they really need a smartwatch, or will this time be different? Referring to the much-awaited Apple Watch, Apple CEO Tim Cook said in a press release, "Apple introduced the world to several category-defining products, the Mac, iPod, iPhone and iPad. And once again Apple is poised to captivate the world with a revolutionary product that can enrich people's lives. It's the most personal product we've ever made."

In fact, the "wearable category" covers almost everything from Fitbit's $99 Flex fitness tracker and Nike's $99 FuelBand fitness monitor to Samsung's $199 Galaxy Gear smartwatch. In January 2014, at the CES trade show in Las Vegas, Washington-based Innovega revealed iOptik, its latest effort at a wearable computer in the form of contact lenses. Working in concert with the human eye, iOptik uses its lenses to project an image of apps and information through the wearer's pupil and onto the back of the retina; the images superimpose to produce a view of the world overlaid with information. The product has yet to be approved by the U.S. Food and Drug Administration (FDA), but the company plans further trials later this year or early next year.

The wearable computing concept is evolving to become even more personal, and not just for the benefit of the wearer. In the near future, expectant mothers may wear electronic "tattoos," smart-sensing stickers that monitor fetal heart rate and brain waves, detect early signs of labor, and even notify the doctor directly when it's time to go to the hospital.

Wearable computing devices have potential benefits for any situation where information or communication is desired, and the use of a hands-free interface is considered beneficial or essential. In addition to consumer products, many industry-specific applications in markets such as defense, healthcare, manufacturing and mining are also emerging.

The growth of the consumer market for wearables largely depends on how rapidly existing smartphone users will adopt wearable accessories and alternative devices. With new and improved innovation hitting the global market every day, only time will reveal whether wearables will ultimately replace smartphone technology in many consumer environments.

For our relevant report on wearable technology, visit the following link:
http://www.bccresearch.com/market-research/information-technology/wearable-computing-ift107a.html

Wednesday, September 10, 2014

The Future of Multi-Touch Technology is Right Here, Right Now

Touch screen-based interactivity has rapidly progressed from a desired feature to an almost mandatory requirement for displays used in various types of equipment. Vending machines, home appliances, vehicle control consoles and industrial instruments increasingly feature a touch screen. Human-machine interfaces (HMIs) and human-computer interfaces (HCIs) continue to evolve, with simple on/off button controls giving way to advanced gesture-based screen interaction requiring so-called multi-touch operation.
The multi-touch revolution essentially began in 1982, when the Input Research Group at the University of Toronto, Canada, developed the first human-input multi-touch system. A frosted glass panel was used, with a camera placed behind the glass. When one or more fingers touched the glass, they appeared as dark spots on the otherwise white background, which the camera detected and registered as input. The system was even pressure sensitive, since the size of each spot depended on how hard the person pressed the glass.
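The detection step the Toronto group used can be illustrated in a few lines of image processing. The sketch below is hypothetical and not their actual code: it thresholds a grayscale camera frame, finds dark fingertip blobs with connected-component labeling, and uses blob area as a crude pressure estimate, mirroring the behavior described above.

```python
import numpy as np
from scipy import ndimage

def detect_touches(frame, threshold=80):
    """Find dark fingertip spots in a grayscale frame from a camera
    behind frosted glass. Returns (row, col, area) per touch; a larger
    area suggests harder pressure, as in the 1982 Toronto system."""
    dark = frame < threshold              # fingers show up as dark spots
    labels, count = ndimage.label(dark)   # connected-component analysis
    touches = []
    for i in range(1, count + 1):
        ys, xs = np.nonzero(labels == i)
        touches.append((ys.mean(), xs.mean(), len(ys)))  # centroid + area
    return touches

# Synthetic 8x8 "frame": bright background with one dark fingertip blob.
frame = np.full((8, 8), 200, dtype=np.uint8)
frame[2:5, 3:6] = 30
print(detect_touches(frame))   # one touch near (3.0, 4.0) with area 9
```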

In 2005, Jefferson Han's presentation of a low-cost, camera-based sensing technique using frustrated total internal reflection (FTIR) truly highlighted the potential role the technology could play in developing the next generation of human-computer interfaces. Han's system was cheap, easy to build, and was used to illustrate a range of creatively applied interaction techniques.

In 2007, Apple Inc. changed the face of the consumer-electronics market with the release of the iPhone, a mobile phone with a multi-touch screen as its user interface. The iPhone's interface and interaction techniques received considerable media attention, and numerous companies have flooded the market with similar products since. Later that same year, Microsoft announced its Surface multi-touch table, which had the appearance of a coffee table with an embedded interactive screen. Cameras fitted inside the table captured reflections of hands and objects as inputs; by employing a grid of cameras, the Surface achieved a sensing resolution sufficient to track objects augmented with visual markers.

At last year's CES, 3M debuted its larger-than-life 84-inch Touch System. This "touch table" supports 4K and is currently demonstrating its abilities at Chicago's Museum of Science and Industry; there are reports that a 100-inch version is under development. Multi-touch display technology holds great promise for future product development. By focusing on simplicity in manufacturing, cost efficiency, and effective use of existing technologies, the Lemur music controller, believed to be the world's first commercial multi-touch display product, was brought to market in a span of only three years.
Undoubtedly, multi-touch technology has reshaped the ways in which we interact with the digital world on a daily basis. As consumer technology continues to evolve, there's no telling what the future might hold. From smartphones to tablets, multi-touch devices have become a routine part of our everyday lives. Multi-touch PC experiences are also well on their way, and Ractiv's Touch+, launched this August, is one of many. Touch+ enables users to turn any flat surface into a controller for their desktop or laptop, similar to an iPad or other tablet device. By detecting a user's hand movements, Touch+ effectively removes the need for a traditional mouse or trackpad and simulates the experience of using a tablet or touch-screen device on a desktop or laptop.
Multi-touch technology combined with surface computing is radically transforming our relationship with computers. Films like Minority Report, The Matrix: Revolutions, District 9, and Quantum of Solace have all included multi-touch interfacing in their visions of the future, a future we are already beginning to experience today. Some of the most important technological advances of the past five years have been in the interface. As new and improved gadgets become capable of an ever-expanding variety of functions, consumers are likewise thinking more creatively about how they interact with them. Usability is a huge priority in technology design, and as a result, the world's leading technology manufacturers are investing millions of dollars in making their devices easier to control.
For our relevant report on multi-touch technology, visit the following link:
