Tuesday, November 25, 2014

Will Gene-edited Stem Cells Hold the Key to Fighting HIV/AIDS?

In research conducted at Harvard University, a new gene-editing technique was used to create what could prove to be an effective method for blocking HIV from invading and destroying patients' immune systems. The work was led by Chad Cowan and Derrick Rossi, associate professors in Harvard's Department of Stem Cell and Regenerative Biology (HSCRB).
According to the researchers, this is the first published report of CRISPR/Cas technology being used to efficiently and precisely edit clinically relevant genes out of cells collected directly from people, in this case human blood-forming stem cells and T cells.

In theory, such gene-edited stem cells could be introduced into HIV patients via bone marrow transplantation (the procedure used to transplant blood stem cells into leukemia patients) to give rise to HIV-resistant immune systems.

Human immunodeficiency virus infection and the acquired immune deficiency syndrome it causes (HIV/AIDS) together constitute one of the most catastrophic threats to human health in the world. Improved treatment options and methods of diagnosis have helped to moderate the growth of the epidemic, presenting opportunities for companies prepared to engage actively in this field. However, the disease remains, in many respects, a very significant threat, and there is an ongoing, urgent need for promising new research as well as optimal use of the treatment and diagnostic options already developed.
According to the Joint United Nations Programme on HIV/AIDS (UNAIDS), there were 35 million people across the globe living with HIV in 2012. That same year, 1.6 million people died of HIV/AIDS, including 1.2 million AIDS-related deaths in sub-Saharan Africa.
According to a BCC Research report, in 2012 the HIV therapeutics market was worth $17.5 billion. The global market is expected to peak at $20.9 billion in 2016 and shrink back to $19.6 billion in 2018, representing an overall compound annual growth rate (CAGR) of 1.5%. The patent expiration of leading antiretroviral drugs and the subsequent introduction of generic drugs will create cost pressures that drive overall revenues down, suppressing market growth.
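Figures like these are easy to sanity-check. The minimal Python sketch below applies the standard CAGR formula to the 2012 and 2018 endpoints quoted above; the small gap between this simple endpoint calculation (about 1.9%) and the report's stated 1.5% presumably reflects a different base year or rounding in the report's methodology. The same helper can be reused on the other market figures cited later in this blog.

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate between two endpoint values."""
    return (end_value / start_value) ** (1 / years) - 1

# Report endpoints: $17.5 billion (2012) to $19.6 billion (2018).
print(f"{cagr(17.5, 19.6, 2018 - 2012):.1%}")  # ~1.9% using these endpoints
```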
Though this new approach to HIV therapy might be ready for human safety trials in less than five years, the researchers are still cautious about celebrating victory. Even if the approach works perfectly, further development will be needed before the therapy can be introduced to the global market.
For more information on a BCC Research market report about HIV therapeutics and diagnostics, visit the following link:


Friday, November 14, 2014

Growing Prevalence of a Silent Killer: Diabetes

The number of people around the world suffering from diabetes has skyrocketed in the last two decades, from 30 million to 230 million, claiming millions of lives and severely taxing the ability of health care systems to deal with the epidemic, according to data released by the International Diabetes Federation. The demographics of the diabetes epidemic are also changing rapidly at the same time. While the growing problem of diabetes in the United States has been well documented, the federation’s data shows that 7 of the 10 countries with the highest number of diabetics are in the developing world.

Type I diabetes is an autoimmune disease; it typically appears in childhood, is lifelong, and currently must be treated with insulin. Type II diabetes typically appears in middle age. It is linked to obesity and therefore is more prevalent in developed countries with relatively affluent lifestyles, sedentary occupations, and dietary overindulgence. 
According to the American Diabetes Association's journal, Diabetes Care, Asia accounts for 60% of the world's diabetic population. In recent decades, Asia has undergone rapid economic development, urbanization, and transitions in nutritional status. China now has the largest number of diabetics over age 20, around 39 million people, or about 2.7 percent of the adult population, according to the federation. The group lists India as second, with an estimated 30 million cases, or about 6 percent of the adult population.
There are many factors driving the growth in diabetes worldwide, but most experts agree that changes in lifestyle and diet are the chief culprits, in addition to genetic predisposition. As developing countries rapidly industrialize, people tend to do work involving less physical activity. At the same time, the availability of food that is cheap but high in calories becomes more common.

Typically, type II diabetes develops after a person becomes obese and insulin resistance sets in. When this occurs, the cells do not respond properly to insulin, glucose does not enter the cells, and blood glucose (sugar) levels rise. Insulin resistance is much more likely to occur when fat is stored in the "wrong" places in the body (blood vessels, heart, and muscles), although experts are not sure exactly how the association works.

The most common treatment for type II diabetes today involves initially placing the patient on a special diet. Some patients may also need pills that increase insulin secretion and make the cells more sensitive to insulin, and occasionally tablets that reduce the body's production of glucose. However, after a few years these treatments gradually lose their efficacy for about one-third of all patients, and insulin injections become necessary. 

Today, however, the most effective treatment to prevent the onset of type II diabetes among very obese patients is bariatric surgery.

World-renowned British specialist Dr David Cavan, director of policy at the International Diabetes Federation, hands patients a lifeline with a simple regime that can reduce the devastating effects of type II diabetes. His plan includes adopting a healthy diet, getting support from your family, boosting exercise, assessing current diabetes drugs, keeping up to date with monitoring the condition and, finally, being realistic about what you want to achieve.
According to Dr Cavan, people with type II diabetes will be more motivated to change their lifestyle if they realize that it is possible to become free from diabetes than if they think that, whatever they do, they will always have it. He added that reducing sugar, and understanding that some starchy carbohydrates have almost the same effect as eating sugar, can bring about swift changes.
According to the International Diabetes Federation and other major professional organizations, the global population of individuals with diabetes (type I and II) was about 240 million in 2010, and is expected to rise to 300 million by 2025.  The corresponding market of products used to diagnose and treat diabetes was $118.7 billion for 2012, and is expected to rise to almost $157 billion over the next five years.  The market for monitoring equipment stands at approximately $14 billion and is set to rise toward $21 billion by 2017.

Working toward innovative solutions in this area, the French pharmaceutical company Servier plans to pioneer a tiny drug-loaded implantable pump, developed by the Boston-based start-up Intarcia Therapeutics Inc., which is anticipated to transform the global market for diabetes treatment.

Servier has agreed to pay Intarcia Therapeutics Inc. $171 million up front, with potential additional payments that could increase the total to more than $1 billion, for rights to co-develop the device for most markets outside the U.S., the companies said. Closely held Intarcia retains full rights to the treatment for the U.S. and Japan. The pump hasn’t yet been approved for sale; the companies plan to submit it to regulators in the first half of 2016.

In Sweden, researchers from Stockholm University say they have uncovered a new mechanism that encourages glucose uptake in brown fat, whose main function is to create heat by burning fat and sugar. Using this new knowledge, the researchers say they may be able to stimulate this signaling pathway with drugs, lowering blood sugar levels and potentially even curing type II diabetes.
Brown fat is active in adults and is one of the bodily tissues that can be encouraged to take up large amounts of glucose from the bloodstream as fuel for creating body heat, the researchers said. As such, increasing the uptake of glucose in brown fat can quickly decrease blood sugar levels, they added.
In a person with type II diabetes, the body's tissues are unable to respond to insulin, rendering them unable to take up sugar from the blood. Because insulin is released after eating to regulate blood sugar, when the insulin signal no longer functions properly, blood sugar levels rise. Very high blood sugar levels are dangerous to organs in the body and can lead to heart disease, kidney failure, blindness, peripheral nervous system damage, amputations and even early death.
"This is completely new and groundbreaking research," Prof. Tore Bengtsson of Stockholm University's Department of Molecular Biosciences said.
On December 20, 2006, the United Nations (UN) passed a resolution designating November 14 as World Diabetes Day. The day aims to raise awareness of diabetes, its prevention and complications, and the care that people with the condition need. World Diabetes Day was first commemorated on November 14, 2007, and is observed annually.
For BCC Research market reports on diabetes, visit the following links:

Tuesday, November 11, 2014

The Impact of DNA Vaccines on the Biotechnology Industry

Vaccines are considered a standard preventive treatment in many clinical situations today. They work by inducing an immune response against an inactivated or weakened pathogen to protect against future infection.
A new type of vaccine, the DNA vaccine, provides an alternative method of producing immunity. First developed during the 1990s, DNA vaccines use genetically engineered DNA to provoke an immune response: the body's cells take up the injected DNA sequences and express them as proteins of the pathogen, and the body then creates antibodies specific to those proteins, conferring immunity without causing infection. This is important for immune-compromised patients, including those infected with HIV, in whom conventional vaccines, particularly live attenuated ones, can potentially trigger an actual infection. 
Though still in the experimental stage, DNA vaccines have several advantages over conventional vaccines. Conventional vaccines cover only a small number of diseases, but DNA vaccines are relatively easy to design for a range of difficult pathogens and could target a wide range of conditions, including cancers and allergies as well as infectious diseases. Studies over the past decade suggest that DNA vaccines could provide immunity against infections such as HIV/AIDS and malaria, which kill millions worldwide every year.
DNA vaccines are also easier to distribute than traditional vaccines because they are more stable, avoid the risk of accidental infection by the pathogen, and require no refrigeration. Conventional vaccines can potentially become inert when stored in improper environments, while DNA vaccines are less susceptible to damage due to environmental conditions, such as extreme temperatures or humidity. They can be administered safely to people who live in areas where regular vaccines are difficult to maintain or may be compromised due to the lack of proper storage facilities.
DNA vaccines, if integrated into the body appropriately, can produce a sustained immune response, making booster vaccinations unnecessary. After receiving a single DNA vaccine, an individual can have lifelong immunity to a disease, decreasing the need (and cost) for booster shots.
In addition to the general medical benefits, DNA vaccines can provide a large economic benefit. Because DNA vaccines face fewer restrictions in production and storage than regular vaccines, the cost of producing and maintaining them is much lower, which can be especially beneficial to people in developing countries. According to certain case studies, developing and manufacturing a successful conventional vaccine can cost from $500 million to $1 billion; by comparison, developing and manufacturing a DNA vaccine ranges between $200 million and $300 million.
Currently there is limited knowledge of the effects of DNA vaccines on humans, since most tests have been conducted only on lab animals. Potential side effects could include chronic inflammation, because the vaccine continuously stimulates the immune system to produce antibodies. Other concerns include the possible integration of plasmid DNA into the body’s host genome, resulting in mutations, problems with DNA replication, triggering of autoimmune responses, and activation of cancer-causing genes.
A 2014 market research report published by BCC Research forecasts the global market for DNA vaccines will grow from $305.3 million in 2014 to $2.7 billion in 2019, yielding an impressive compound annual growth rate (CAGR) of 54.8%. While research tools and animal health applications currently comprise the commercialized market, human clinical DNA vaccines will make up the vast majority of this market by 2019.
In an age of genomics, where DNA can be sequenced and synthesized more quickly, accurately, and cheaply than ever before, and where the safe handling of live pathogens is fraught with risk and difficulty, further research on DNA vaccines is surely a worthwhile pursuit for addressing modern food security, animal health, and perhaps even human healthcare challenges.
For our market research report on DNA vaccines, visit the following link:


Monday, October 20, 2014

Transforming the Future of Lighting Systems with LED Technology

Although the origin of light-emitting diode (LED) technology can be traced back to 1927, it did not enter the commercial arena until much later, largely because of its high production cost; in recent years, however, it has rapidly gained ground. With the increasing demand for greener, more energy-efficient products, as well as the worldwide strain on energy resources, is LED technology the answer to our lighting needs? What does the future hold for LED technology, and does it have what it takes to displace the traditional light bulb, perhaps the most life-changing invention in human history?
LED lights are likely the most environmentally friendly lighting option available today. So what makes them the leading choice in industrial, architectural, and horticultural applications around the world? LEDs last as much as 20 times longer than other lighting sources and therefore don't need to be replaced as often, which reduces the impact of manufacturing, packaging, and shipping. LEDs are also designed to provide more than a decade of near maintenance-free service; less servicing further reduces environmental impact.
Additionally, LEDs consume much less energy than incandescent and high-intensity discharge (HID) lights. LED lights use only 2–17 watts of electricity, 25%–80% less than standard lighting systems. And while compact fluorescent lights are also energy-efficient, LEDs burn even less energy. LEDs contain no mercury, unlike their HID counterparts, whose mercury-laden remnants can seep into the water supply and adversely affect sea life and those who consume it. 

According to the U.S. Department of Energy, “Widespread use of LED lighting has the greatest potential impact on energy savings in the United States. By 2027, widespread use of LEDs could save about 348 terawatt hours (compared to no LED use) of electricity: This is the equivalent annual electrical output of 44 large electric power plants (1000 megawatts each), and a total savings of more than $30 billion at today’s electricity prices.”
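The power-plant equivalence in that projection can be checked with a few lines of arithmetic. Below is a minimal Python sketch, assuming a roughly 90% capacity factor for a 1,000-megawatt plant (our assumption; the DOE quote does not state one):

```python
savings_twh = 348          # DOE's projected annual savings by 2027, in TWh
plant_mw = 1000            # nameplate capacity of one large plant
capacity_factor = 0.9      # assumed utilization; not stated in the DOE quote

# Annual output per plant: MW * hours/year -> MWh, converted to TWh.
twh_per_plant = plant_mw * 8760 * capacity_factor / 1e6
print(f"{twh_per_plant:.2f} TWh per plant, "
      f"{savings_twh / twh_per_plant:.0f} plants' worth of output")
# ~7.88 TWh per plant, or roughly 44 plants, matching the DOE figure.
```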

The electrical maintenance required for lighting systems in public buildings that receive harsh, prolonged use, sometimes 24 hours a day, 365 days a year, is overwhelming. In public building management, time is money, and because LED fixtures need replacing far less often than traditional lighting, maintenance staff spend less time on a ladder changing bulbs. LED lighting also contributes to energy savings and sustainability by improving working conditions through deliberately directed light and by reducing the energy needed to power lighting fixtures.
A groundbreaking advancement in this area came to the forefront when Isamu Akasaki, a professor at Meijo University, Hiroshi Amano, a professor at Nagoya University, and Shuji Nakamura, a Japanese-born professor currently at the University of California, Santa Barbara, won the Nobel Prize in Physics earlier this month for inventing the world's first blue light-emitting diodes (LEDs).
While red and green LEDs had been around for some time, the elusive blue LED represented a long-standing challenge for researchers in both academia and industry. Without this critical last piece, scientists were unable to produce white light from LEDs, as all three colors needed to be mixed together for this to happen.
The white LED lamps that resulted from this invention emit very bright white light and are superior in terms of energy efficiency and lifespan when compared with incandescent and fluorescent bulbs. LEDs can last some 100,000 hours, whereas incandescent bulbs typically last only about 1,000. “With 20% of the world’s electricity used for lighting, it’s been calculated that optimal use of LED lighting could reduce this to 4%,” said Dr. Frances Saunders, President of the Institute of Physics. “This is physics research that is having a direct impact on the grandest of scales, helping protect our environment, as well as turning up in our everyday electronic gadgets.”
What's more, these LED lamps have the potential to improve the quality of life for the more than 1.5 billion people around the world who do not have access to electricity grids. Because LEDs require very little energy, they can run on cheap local solar power. 
Steven DenBaars, a research scientist at UC Santa Barbara, has been working on LED lights for 20 years. In his laboratory, he is already onto the next big thing: Replacing a substantial portion of indoor lights, and the archaic bulb and socket infrastructure on which they depend, with lasers.
According to DenBaars, a laser works much like an LED light bulb. "It's the same materials, but you put two mirrors on either side of the LED and it breaks into a laser. Once you get reflection back and forth, you get an amplification effect, and it goes from regular emission to stimulated emission."
Simply replacing the light-emitting diodes in a typical LED bulb with a laser diode wouldn't work. This hypothetical laser light bulb would catch fire from all the waste heat it would generate, and it would emit an ungodly amount of light, more than enough to blind anyone who looked at it. Instead, DenBaars imagines using just a handful of tiny but powerful lasers, then redirecting their light into fiber-optic cables and other types of light-transmitting plastic that could distribute it evenly as a warm, diffuse glow.

BMW's "hybrid supercar," the i8, uses headlights that are the latest example of laser-based lighting technology. As with all lasers repurposed for conventional illumination, blue laser diodes are aimed at a phosphor that transforms the blue laser light into more diffuse white light. The result is headlights with such a long working life that they could "easily outlive the automobile" in which they're installed, notes IEEE Spectrum.
Laser lights could solve the problem of bridging the gap between traditional light sockets and more radical configurations of new lighting technologies. With just a few point sources of laser light installed in a building, illumination can be redirected throughout the structure via plastic fiber-optic cables running along ceilings and around corners, much as the cable company runs its wires into buildings and through rooms without tearing holes in walls or interfacing with the building's electrical system. "Rather than route the electricity to the bulb you can route the light to the sources. LEDs let you do that too, but lasers would take it a couple steps further," says DenBaars.
LEDs are helping change the way we light up our world, enabling environmentally friendly, energy-efficient light sources that offer a dramatic improvement over the incandescent bulbs pioneered in the late 19th century.

For our relevant BCC Research report on LED, visit the following link:

Monday, October 13, 2014

Minding Our Brain’s Business: Unravelling the Neuroscience of the Human Brain

For centuries, the mysteries of grey matter have baffled scientists and researchers alike. How can humans manage to store countless moments, past and present, in one single organ? How do animals map their paths back and forth? How do we figure out a shortcut to work when there's a big traffic jam? How do we recall where we parked our car? And so on.
The brain, as it turns out, has a GPS-like function that enables people to produce mental maps and navigate the world. For this discovery, husband-and-wife scientists Edvard Moser and May-Britt Moser of Norway and New York-born researcher John O'Keefe were recently honored with the Nobel Prize for breakthrough experiments on rats that could help pave the way for a better understanding of human diseases such as Alzheimer's. Their work solves the problem of how the brain creates a map of the space surrounding us and how we navigate our way through a complex environment. In other words, it reveals the brain's internal positioning system and gives clues to how strokes and Alzheimer's affect the brain.
"We can actually begin to investigate what goes wrong" in Alzheimer's, said O'Keefe. "The findings might also help scientists design tests that can pick up the very earliest signs of the mind-robbing disease, whose victims lose their spatial memory and get easily lost," he added.
It was in London in 1971 that O'Keefe, conducting his research on rats, discovered the first component of the brain's positioning system. O'Keefe, now director of the centre for neural circuits and behaviour at University College London, found that a type of nerve cell in a brain region called the hippocampus was always activated when a rat was at a certain place in a room. Other nerve cells were activated when the rat was in other positions. O'Keefe concluded that these "place cells" were building up a map, not just registering visual input.
"I made the initial discovery over 40 years ago. It was met then with a lot of scepticism," the 74-year-old O'Keefe said. "And then slowly over years, the evidence accumulated. And I think it's a sign of recognition not only for myself and the work I did, but for the way in which the field has bloomed."
What is vital, however, is that knowledge about the brain's positioning system can also help us understand what causes loss of spatial awareness in stroke patients and in those with brain diseases like dementia, of which Alzheimer's is the most common form, affecting 44 million people worldwide.
In 1996, Edvard Moser and May-Britt Moser, now based in scientific institutes in the Norwegian town of Trondheim, worked with O'Keefe to learn how to record the activity of cells in the hippocampus. In 2005, they identified another type of nerve cell in the entorhinal cortex region in the brains of rats that functions as a navigation system. These so-called "grid cells," they discovered, are constantly working to create a map of the outside world and are responsible for animals' knowing where they are, where they have been, and where they are going.
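The hexagonal firing pattern of a grid cell can be captured with a common textbook idealization: summing three cosine gratings oriented 60 degrees apart produces bumps of activity on a hexagonal lattice. The Python sketch below is an illustrative toy model only, not the laureates' analysis code, and the spacing and phase parameters are our own assumptions.

```python
import numpy as np

def grid_cell_rate(x, y, spacing=1.0, phase=(0.0, 0.0)):
    """Toy firing-rate map for a single grid cell: three cosine
    gratings 60 degrees apart yield a hexagonal firing lattice."""
    k = 4 * np.pi / (np.sqrt(3) * spacing)   # wave number set by grid spacing
    angles = np.deg2rad([0, 60, 120])
    rate = sum(np.cos(k * ((x - phase[0]) * np.cos(a) +
                           (y - phase[1]) * np.sin(a)))
               for a in angles)
    return np.maximum(rate, 0)               # rectify: no negative firing

# Evaluate on a 2 m x 2 m arena; peaks mark where this cell would fire.
xs, ys = np.meshgrid(np.linspace(0, 2, 100), np.linspace(0, 2, 100))
rates = grid_cell_rate(xs, ys, spacing=0.5)
print(rates.shape, round(float(rates.max()), 2))
```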
The Nobel Assembly said the laureates' discoveries marked a shift in scientists' understanding of how specialized cells work together to perform complex cognitive tasks. They have also opened new avenues for understanding cognitive functions such as memory, thinking, and planning.
The finding, a fundamental piece of research, explains how the brain works but does not have immediate implications for new medicines, since it does not set out a mechanism of action.
For our relevant BCC Research reports on Alzheimer’s, visit the following links:


Friday, October 10, 2014

The Changing Face of the Healthcare Industry with mHealth Technologies

The health industry is increasingly responding to the rising popularity and availability of technological innovations, such as tablets and smartphones. The use of connected devices to collect patient data, monitor ongoing conditions, access health information, and communicate with providers, patients, and peers is a trend that is spreading like wildfire. Health applications have the potential to be adapted and used by healthcare professionals and consumers, helping to revolutionize the sector and reflect the digital age we live in.
Mobile Health (mHealth) can provide cost-effective solutions within the global healthcare environment, which faces budget constraints, an increasing prevalence of chronic conditions, and a limited healthcare workforce. mHealth is the use of mobile and wireless technologies to support healthcare systems and achieve healthcare objectives.  After several successful global trials, several mHealth services have entered the commercialization phase and many mobile applications have been launched, stimulating partnerships with software developers, mobile operators, governmental and non-governmental organizations, and leading healthcare players.
Over the next decade, innovations within the mHealth market will be driven by the evolution of smartphone technologies, improvements in wireless coverage, and the remote treatment and monitoring of prevalent chronic diseases. According to a BCC Research report, the global mHealth market reached nearly $1.5 billion in 2012 and $2.4 billion in 2013, and it is expected to reach $21.5 billion in 2018, a compound annual growth rate (CAGR) of 54.9% over the five-year period from 2013 to 2018.
For healthcare professionals, mobile or tablet apps also have enormous potential for training and professional development. Connectivity is built in, facilitating a blended learning platform with easily updatable information, in an accessible format. This allows for a truly flexible and enjoyable teaching and learning experience, ideal for both professionals and students, with information available anytime, anywhere.
Not only do health training and development apps provide more dynamic training tools, but they can also bring huge cost savings. Apps are inexpensive to produce and update, especially when compared to other training tools. Tablets and smartphones are readily available and the technology is relatively low cost when compared to other health technologies and professional training tools.
Apple’s new software, HealthKit, is designed to collect data from various health and fitness apps, making that data easily available to Apple users through the company’s new Health app. HealthKit is being developed to send data directly into hospital and doctors' charts, too.
Craig Federighi, Senior Vice President at Apple, at a conference held earlier this year, said, “Developers have created a vast array of healthcare devices and accompanying applications, everything from monitoring your activity level, to your heart rate, to your weight, and chronic medical conditions like high blood pressure and diabetes. ... [But] you can’t get a single comprehensive picture of your health situation. But now you can, with HealthKit.”
Mobile phone carriers such as Verizon and Sprint are also using their vast and trusted networks to bring mobile patient engagement and data to the forefront. “[Verizon’s] Converged Health Management is a perfect example of how we are using our unique combination of assets like our 4G LTE wireless network and cloud infrastructure to deliver an innovative, cost-effective and game-changing solution to the marketplace,” said John Stratton, President of Verizon Enterprise Solutions.
For relevant BCC Research reports on telemedicine technologies, visit the following links:
http://www.bccresearch.com/market-research/healthcare/telemedicine-technologies-report-hlc014g.html

Wednesday, October 8, 2014

MISSION SAVE EARTH: EMERGING TRENDS IN ENVIRONMENTAL SENSING AND MONITORING TECHNOLOGIES

Environmental field monitoring technologies have advanced rapidly in the last decade, concurrent with advances in digital technology, computational power and Internet-enabled communications. Environmental sensors have become much smaller, faster and often less expensive. Advances in air-sensing technologies, in particular, now enable rapid retrieval of time-critical pollution data on a large scale. Fast, low-cost sensors afford the possible networking of multiple units within a sensor grid network so that even street-level monitoring can be achieved.
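To make the sensor-grid idea concrete, the short Python sketch below simulates a block of networked air-quality nodes and flags local pollution hotspots. The node names, readings, and alert threshold are hypothetical, for illustration only, not any vendor's actual API.

```python
import random
import statistics

# Hypothetical street-level grid: each node reports a PM2.5 reading (ug/m3).
readings = {f"node_{i:02d}": random.gauss(35, 10) for i in range(25)}

mean_pm25 = statistics.mean(readings.values())
sd_pm25 = statistics.stdev(readings.values())

# Flag nodes reading more than two standard deviations above the grid mean.
hotspots = {node: round(v, 1) for node, v in readings.items()
            if v > mean_pm25 + 2 * sd_pm25}
print(f"grid mean: {mean_pm25:.1f} ug/m3; hotspots: {hotspots}")
```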
Several governments across the globe are playing an active role in funding and encouraging environmental monitoring programs, thereby keeping the growth in the global market buoyant. In the U.S. market alone, some $250 billion of economic output stems from all pollution control and monitoring activities each year. Among the faster-growing segments of this business are the markets for sophisticated sensors; monitoring equipment; large-scale networks, such as satellite, GPS and remote sensing; associated networking equipment and ancillaries; and a large slate of new technologies. Globally, the markets for environmental sensors and the related subsegments account for approximately $13 billion of economic activity at present, with a projected average annual growth of 5.9% through 2019, according to a BCC Research report.
Among the key trends in the environmental sensors industry are miniaturization down to the nano scale, continuous and/or real-time sensing capabilities, wireless networked operation, rapid processing, and increased sensitivity or flexibility. Dominant trends in the sensors business include the development of more large-scale monitoring systems, such as remote sensing and satellite-based large-area sensors, and private companies are now entering the environmental monitoring satellite business. Mobile environmental sensing systems are increasingly being tested and proposed for urban areas; such systems identify and monitor urban air pollution events, and the resulting data can be correlated with levels of local transport or industrial activity.

A new public housing estate in Singapore, to be launched in 2015 in the Punggol Northshore district, will install sensors to monitor residents' waste disposal. The housing authority will then analyze the collected data to deploy resources for waste collection. The district will also feature other smart technologies, such as intelligent car parking and smart lighting.

The United States' 2009 Economic Recovery Act provided hundreds of millions of additional dollars for research on environmental monitoring and sensors to U.S. entities such as the EPA, the DOE, NASA and certain government labs. Advancements have been made in networking from space; with additional satellites, networked coverage of the globe's surface is becoming ever more comprehensive. China successfully launched the Yaogan-21 remote sensing satellite from Taiyuan Satellite Launch Center in September 2014. Yaogan-21 will be used for scientific experiments, natural resource surveys, estimation of crop yield and disaster relief.

In light of the now-numerous international reports on climate change that confirm humanity's considerable impact on the environment, scientists agree that more sophisticated monitoring programs are urgently needed to detect ecological changes before they become irreversible. The surging need to monitor emissions continues to fuel demand for more sensitive and cost-effective environmental sensors. Improvements in nanotechnology and micro-electromechanical systems for sensor development, design, and production are expected to benefit the market; nanotechnology, for example, enables sensors to selectively detect multiple analytes and to monitor their presence in real time. The sensors business is a very dynamic area of the economy, and thus a sector with huge profit-making potential for those who can correctly identify future opportunities in environmental sensing.
For our relevant BCC Research report on environmental sensing and monitoring technologies, visit the following link:

http://www.bccresearch.com/market-research/instrumentation-and-sensors/environmental-sensor-markets-ias030c.html

Tuesday, September 23, 2014

To Preserve or Not To Preserve: Future of Stem Cell Research

The banking or preservation of stem cells, such as those from umbilical cord blood or bone marrow, has been increasing as the potential for using stem cells in clinical applications continues to fuel speculation and expectations. Researchers have been steadily exploring how best to use stem cells, what types to use, and how to deliver them to the body. The findings are not only transformational but also progressive and pragmatic.
Preliminary and promising research through clinical and experimental trials suggests that stem cells may be able to treat autoimmune diseases such as type 1 diabetes, as well as Parkinson's disease, brain and spinal-cord injuries, cardiovascular disease, liver disease, kidney disease, and breast cancer, among other illnesses. Whether donated or stored privately, banked cord blood is proving to be a rich source of life-saving treatment, now and in the future, as its possibilities continue to expand.
"Initial studies suggest that stem cell therapy can be delivered safely," said Dr. Ellen Feigal, senior vice president of research and development at the California Institute for Regenerative Medicine, which has awarded more than $2 billion toward stem cell research since 2006 and is enrolling patients in 10 clinical trials this year. In addition to continuing safety research, Dr. Feigal added, "Now what we want to know is will it work, and will it be better than what's already out there?"
On the other hand, Dr. Charles Murry, Co-director of the Institute for Stem Cell and Regenerative Medicine at the University of Washington, believes that it is important to note that very few therapies beyond bone marrow transplants have been shown to be effective. Websites, newspapers and magazines advertising stem cell therapies leave the impression among the masses that such treatments are ready to use and that “the only problem is the evil physicians and government, who want to separate people from lifesaving therapies,” he said.
Scientists are now exploring direct therapies in new and innovative ways, such as reproducing and studying diseases in a dish using cells created from patients with specific ailments. Kevin Eggan of the Harvard Stem Cell Institute is using this technique to study amyotrophic lateral sclerosis (ALS), or Lou Gehrig's disease. He began this work five years ago, when he took skin cells from two women dying of the same genetic form of ALS, turned them into stem cells and then into nerve cells, and discovered an electrical problem: the cells weren't signaling to one another properly, which he theorized was probably causing the neural degeneration that characterizes ALS.
After replicating these nerve cells multiple times and testing various drug compounds to see which would correct the electrical signaling problem, he found a candidate drug — an existing medication approved for epilepsy — that will be tested in ALS patients as soon as the end of this year.
Increasingly, though, companies are competing with medical institutions offering stem cell harvesting and preservation services. Parents seeking to preserve stem cells for their children are turning to the harvesting and preservation of umbilical cord blood as a source of stem cells. The harvesting of stem cells from adults for future use in cellular therapeutics and/or tissue engineering has also emerged as a growing facet of the stem cell market. Further still, companies are emerging that offer patients the opportunity to clone their own lines of embryonic stem cells.
The real medical challenge, however, is to uncover which type of cell therapy best addresses each particular medical condition. With numerous experiments and huge amounts of money involved, scientists and researchers have yet to find the most cost-effective ways to deliver stem cells.
For our relevant BCC Research stem cell report, visit the following link:




Friday, September 19, 2014

THE BIG BANG THEORY OF WEARABLE TECHNOLOGY

The impending explosion of the wearable computing market is one of the most interesting and highly anticipated developments taking place in the high-tech industry. The $3 billion wearable consumer market is intrinsically linked with the $240+ billion smartphone market. The key market driver for wearable computing is the soaring global popularity of smartphones from manufacturers including Apple, Samsung Electronics, LG Electronics, HTC, Blackberry, Nokia and Microsoft.

The burgeoning field of wearable technology is hitting the mainstream, and one of the highlights of high-tech wearable devices is that they get smaller, faster, cheaper, and more powerful with every new product. The computing power of the Electronic Numerical Integrator and Computer (ENIAC), which filled a room in the 1940s, can now fit inside a chip in a musical greeting card. Similarly, today's smartphones are more powerful than the PCs of, say, five years ago. Now all the capabilities of a smartphone, such as making calls, taking pictures, connecting to the internet, and video chatting, are being condensed into smartwatches, which can do practically everything a phone or a tablet can.

If the growing trend in the wearable computing industry is to be believed, the time may soon come when phones and tablets are a thing of the past. Google Glass is a perfect example. The product is still under development, but if everything goes as planned, consumers will soon have no need for their standard smartphone. Google Glass will be able to easily respond to verbal commands, augmented by the occasional manual interaction via controls located directly on the frame. There has even been talk about eventually including a laser-projected virtual keyboard for those times when voice just isn’t enough. With the ability to access countless sources of information in seconds and then relay them to a miniature screen situated in the upper corner of the wearer’s vision field, Google Glass makes 4G internet connectivity features seem archaic.

Motorola recently entered the ring with its Moto 360 smartwatch, which is primarily voice operated and can easily display messages and reminders on command. The result is a small, stylish accessory that serves as an assistant, calendar, and phone all at once, edging ever closer to replacing the smartphone outright.

However, Apple's entry into the smartwatch arena last week, with a device that won't go on sale until early 2015, raises questions: Can the company work its magic as it has in the past and convince people that they really need a smartwatch, or will this time be different? Referring to its much-awaited product of the year, the Apple Watch, Apple CEO Tim Cook said in a press release, "Apple introduced the world to several category-defining products, the Mac, iPod, iPhone and iPad. And once again Apple is poised to captivate the world with a revolutionary product that can enrich people's lives. It's the most personal product we've ever made."

In fact, the "wearable category" covers almost everything from Fitbit's $99 Flex fitness tracker and Nike's $99 FuelBand fitness monitor to Samsung's $199 Galaxy Gear smartwatch. In January 2014, at the CES trade show in Las Vegas, Washington-based Innovega revealed iOptik, its latest effort at a wearable computer in the form of contact lenses. Working in sync with the human eye, iOptik uses its lenses to project an image of apps and information through the wearer's pupil and onto the retina; the projected layers superimpose to produce a view of the world overlaid with information. The product has yet to be approved by the U.S. Food and Drug Administration (FDA), but the company plans further development later this year or early next year.

The wearable computing concept is evolving to be even more personal, and not just for the benefit of the wearer. In the near future, expectant mothers may wear electronic "tattoos": smart-sensing stickers that monitor fetal heart rate and brain waves, detect early signs of labor, and even notify the doctor directly when it's time to go to the hospital.

Wearable computing devices have potential benefits for any situation where information or communication is desired, and the use of a hands-free interface is considered beneficial or essential. In addition to consumer products, many industry-specific applications in markets such as defense, healthcare, manufacturing and mining are also emerging.

The growth of the consumer market for wearables largely depends on how rapidly existing smartphone users will adopt wearable accessories and alternative devices. With new and improved innovation hitting the global market every day, only time will reveal whether wearables will ultimately replace smartphone technology in many consumer environments.

For our relevant report on wearable technology, visit the following link:
http://www.bccresearch.com/market-research/information-technology/wearable-computing-ift107a.html

Wednesday, September 10, 2014

The Future of Multi-Touch Technology is Right Here, Right Now

Touch screen-based interactivity has rapidly progressed from being a desired feature to an almost mandatory requirement for displays used in various types of equipment. Vending machines, home appliances, vehicle control consoles and industrial instruments increasingly feature a touch screen. The evolution of human-machine interfaces (HMIs) and human-computer interfaces (HCIs) continues, with simple on/off button controls giving way to advanced gesture-based screen interaction requiring so-called multi-touch operation.
The multi-touch technology revolution essentially began in 1982, when the Input Research Group at the University of Toronto, Canada, developed the first human-input multi-touch system. A frosted glass panel was used with a camera placed behind the glass. When one or more fingers touched the glass, the camera detected each contact as a spot against the otherwise white background, registering it as an input. The system was even pressure sensitive, since the size of each spot depended on how hard the person was pressing the glass.
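The principle behind that 1982 system is simple enough to sketch. In the toy Python example below, bright blobs in a camera frame are treated as finger contacts and blob area stands in for pressure; this is an illustration of the idea, not the original group's implementation.

```python
import numpy as np
from scipy import ndimage

def detect_touches(frame: np.ndarray, threshold: float = 0.5):
    """Toy camera-behind-glass touch detector: each connected bright
    blob is one finger contact; blob area approximates pressure."""
    mask = frame > threshold               # pixels lit up by a touch
    labels, n = ndimage.label(mask)        # connected bright regions
    touches = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        touches.append({"x": xs.mean(), "y": ys.mean(), "pressure": xs.size})
    return touches

# Synthetic 8x8 frame with two "fingers" pressing at different strengths.
frame = np.zeros((8, 8))
frame[1:3, 1:3] = 1.0      # light touch (4 pixels)
frame[4:8, 4:8] = 1.0      # firm touch (16 pixels)
print(detect_touches(frame))
```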

In 2005, Jefferson Han's presentation of a low-cost, camera-based sensing technique using frustrated total internal reflection (FTIR) truly highlighted the potential role the technology could play in developing the next generation of human-computer interfaces. Han's system was cheap and easy to build, and it was used to illustrate a range of creatively applied interaction techniques.

In 2007, Apple Inc. changed the face of the consumer-electronics market with the release of the iPhone, a mobile phone with a multi-touch screen as its user interface. The iPhone's interface and interaction techniques received considerable media attention, and numerous companies have since flooded the market with similar products. Later that same year, Microsoft announced its Surface multi-touch table, which had the appearance of a coffee table with an embedded interactive screen. Cameras fitted inside the table captured reflections of hands and objects as inputs, and by employing a grid of cameras, the Surface achieved sensing resolution sufficient to track objects augmented with visual markers.

At last year's CES, 3M debuted its larger-than-life 84-inch Touch System. This "touch table" supports 4K and is currently demonstrating its abilities at Chicago's Museum of Science and Industry; there are reports that a 100-inch version is under development. Multi-touch display technology holds great promise for future product development. By focusing on simplicity in the manufacturing process, cost efficiencies, and effective use of existing technologies, the Lemur music controller, believed to be the world's first commercial multi-touch display product, was brought to market in a span of only three years.
Undoubtedly, multi-touch technology has reshaped the ways in which we interact with the digital world on a daily basis; from smartphones to tablets, multi-touch devices have become a routine part of our everyday lives. As consumer technology continues to evolve, there's no telling what the future might hold. Multi-touch PC experiences are well on their way, and Ractiv's Touch+, launched this August, is one of many. Touch+ enables users to employ any flat surface as a controller for their desktop or laptop, similar to an iPad or other tablet device. By detecting the user's hand movements, Touch+ effectively removes the need for a traditional mouse or trackpad and simulates the experience of using a tablet or touch-screen device on a desktop or laptop.
Multi-touch technology combined with surface computing is radically transforming our relationship with computers. Films like Minority Report, The Matrix: Revolutions, District 9, and Quantum of Solace have all included multi-touch interfacing in their predictions for the future, a future we are already beginning to experience today. One of the most important technological advances of the past five years has been in the interface. As new and improved gadgets become capable of an ever-expanding variety of functions, consumers are thinking more creatively about how they interact with them. Usability is a huge priority in technology design, and as a result, the world's leading technology manufacturers are investing millions of dollars into making their devices easier to control.
For our relevant report on multi-touch technology, visit the following link:


Friday, September 5, 2014

Redefining 3D Printing Technology through Innovation


Three-dimensional (3D) printing, also called additive manufacturing, is the process of making three-dimensional solids from a digital model by depositing successive layers of material in different shapes. In the last few years, this technology has taken the world of trade and commerce by storm, most notably, retail, traditional manufacturing, automotive, aviation, finance, construction, and electronics. BCC Research estimated the total 2013 global market for 3D printing materials to be worth $245 million. This figure is expected to rise to $285 million in 2014 and $650 million in 2019, a CAGR of 17.9% over the next five years.
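The layer-by-layer idea in that definition is easy to make concrete. The Python sketch below slices a simple cylinder into thin layers, each traced as a polygon perimeter, much as a slicer prepares a digital model for printing; the shape, dimensions, and function names are purely illustrative.

```python
import math

def slice_cylinder(radius=10.0, height=20.0, layer_height=0.2, segments=12):
    """Toy slicer: approximate a solid cylinder as a stack of thin layers,
    each traced as a polygon perimeter. All parameters are illustrative."""
    n_layers = int(height / layer_height)
    layers = []
    for i in range(n_layers):
        z = round(i * layer_height, 4)
        perimeter = [(radius * math.cos(2 * math.pi * s / segments),
                      radius * math.sin(2 * math.pi * s / segments))
                     for s in range(segments)]
        layers.append((z, perimeter))
    return layers

layers = slice_cylinder()
print(f"{len(layers)} layers, {len(layers[0][1])} perimeter points each")
```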

The rate of development and the increasing popularity of 3D printing are astounding, continuously pushing the technology's limits. Every day new breakthroughs are achieved, and ideas that seemed impossible only a few short years ago are becoming commonplace. So much so that NASA has decided to take this emerging technology beyond the stratosphere: into space. About the size of a small microwave oven, NASA's printer is a proof of concept that printing in zero gravity can create objects as accurate and as strong as those produced by a printer on Earth. The objective of the project is to create a machine shop for astronauts in space: astronauts will no longer have to depend on the next resupply mission from the bottom of the gravity well; they can create the needed parts right on board.

The printer is scheduled to be lofted into low-Earth orbit aboard the SpaceX CRS-4 resupply mission this September. If all goes well with this experiment, NASA will move on to a more elaborate next-generation printer, called the Additive Manufacturing Facility, later this year.

One area that will need to advance before complex portable electronics can be fabricated through additive manufacturing is battery manufacturing. Although 3D printing of a battery is not a new concept, a 3D-printed graphene-based battery could be a game changer for several industries. According to the Vancouver-based company Graphene 3D Lab Inc., batteries based on the supermaterial graphene, a single layer of carbon atoms, could outperform even some of the best energy storage devices on the market today. The ability to 3D print a battery allows custom shapes to be introduced into the world of electronics, where companies are trying to cram as many components as possible into the smallest space.

Scientists and researchers are now turning to a field where innovation saves lives. While the printing of complete organs for transplants may be decades away, experts at Monash University in Melbourne, Australia, have developed highly realistic 3D-printed body parts that allow trainee doctors to learn human anatomy without needing access to a real cadaver.
"Our 3D printed series can be produced quickly and easily, and unlike cadavers they won't deteriorate - so they are a cost-effective option too," said Paul McMenamin, Director of the University's Centre for Human Anatomy Education.
3D printing technology is being used to manufacture a wide array of items – from auto parts and prototypes to human skin and organs. In a world where mass-manufacturing takes place on scales never seen before, 3D printing is starting to spell big changes for the way the world thinks about production. This inevitably means new frontiers in global trade will be opened as well.

For our relevant report on 3D printing, visit the following link:



Friday, August 29, 2014

Ebola and Its Relation to Pharmaceutical Companies

Ebola hemorrhagic fever
Ebola hemorrhagic fever (Ebola HF) is a severe, often-fatal disease in humans and nonhuman primates (monkeys, gorillas, and chimpanzees) that has appeared sporadically since it was initially identified in 1976. The virus is one of two members of a family of RNA viruses called the Filoviridae. Ebola is one of the potential bioterrorism agents now targeted by the National Institute of Allergy and Infectious Diseases (NIAID).

Ebola is one of the world's deadliest diseases, with up to 90% of cases resulting in death, although in the current outbreak the rate is about 55%. The outbreak continues to wreak havoc in West Africa, especially in Guinea, Sierra Leone and Liberia, each of which is dealing with hundreds of cases. According to WHO reports, the collective death toll had risen to 2,615 as of August 22, 2014.



Until recently, two pharmaceutical companies were working on R&D toward possible treatments for Ebola. In July 2010, Tekmira, a Canadian pharmaceutical company, was awarded a contract worth up to $140 million under the U.S. government's Transformational Medical Technologies (TMT) program. The contract was for Tekmira to further develop TKM-Ebola, an RNAi therapeutic, to treat Ebola. Unfortunately, after a number of difficulties, development of TKM-Ebola was put on hold as of July 2014. 

However, more prominent in the media has been ZMapp, a biopharmaceutical drug created by California-based Mapp Biopharmaceutical. The drug was shown to cure 100% of monkeys treated in preclinical trials. Kent Brantly, a 33-year-old doctor, and Nancy Writebol, a 59-year-old aid worker, were two of the drug's earliest patients. Both became infected during their stay in West Africa, where they were working to combat the deadly virus.

With the rising death toll in Central and West Africa, academic researchers, biotechnology specialists, and pharmaceutical leaders in Boston and elsewhere are offering tantalizing evidence that vaccines against Ebola and other killer diseases can be made faster and cheaper than previously believed. In a study accepted for publication in the journal Human Vaccines & Immunotherapeutics, a Boston research consortium, VaxCelerate, details how it produced a vaccine ready for animal testing against Lassa fever, a hemorrhagic disease similar to Ebola, for less than $1 million in four months.

VaxCelerate, based within Massachusetts General Hospital’s vaccine center, is part of a broader push by the U.S. government to improve — even transform — vaccine development to better respond to emerging infectious diseases. The U.S. Department of Health and Human Services and the Department of Defense are working with several Boston-area companies and scientists on vaccine initiatives. Also under development are therapies that might help people who have contracted Ebola or who are likely to come into contact with it.

For our relevant BCC Research reports on Ebola, visit the following links:
