Published in the Jan. 24, 2012, Billian’s HealthDATA/Porter Research Hub e-newsletter
By Whitney L.J. Howell
When talk first started about moving healthcare providers away from paper files to electronic health records (EHRs), not everyone was convinced it was a good idea. Jane Jennings' boss at Primary Medical Specialists in Portsmouth, Ohio, was one of them.
“Making the move to a new system seemed too complicated for a doctor nearing the end of her career,” said Jennings, Primary Medical’s office manager. “She didn’t know if she wanted to invest the time and money into learning something new.”
But that was before the office knew about the $44,000 incentive payment it could receive from the Centers for Medicare & Medicaid Services (CMS) for demonstrating that it had fully implemented a certified EHR and was using it meaningfully. The problem, however, was finding the right one.
That’s when Ohio Health Information Partnership (OHIP), Ohio’s regional extension center (REC), took the lead. Not only did it help Primary Medical identify the best system, Jennings said, but it will also provide step-by-step guidance when the office goes live with its software of choice next month.
“From the provider’s perspective, we want to meet meaningful use criteria,” Jennings said. “So rather than the piecemeal approach to an EHR system, it’s been very beneficial for us to take advantage of the work OHIP’s already done and the information it has.”
OHIP, also known as “The Partnership,” isn’t a one-of-a-kind group. Since 2010, the U.S. Department of Health & Human Services’ Office of the National Coordinator (ONC) has invested $677 million in 62 RECs nationwide. Their goal is to guide more than 100,000 primary care providers to EHR meaningful use. The on-the-ground assistance they provide is particularly valuable now – it’s crunch time for Stage 1 of meaningful use, which requires that eligible professionals meet 20 objectives related to their use of an EHR, as set forth by CMS. As of last fall, only a small percentage of hospitals had achieved this first stage, and healthcare providers have only until Feb. 29, 2012, to prove they’ve reached this goal in order to qualify for incentive payments.
Where RECs Are Today
While substantial, REC funding is a drop in the bucket compared to projections for federal health information technology (HIT) growth. Software company Deltek Inc. predicts federal HIT expenditures will grow to $6.5 billion by 2016. But the millions flowing to RECs have been money well spent – many RECs are already achieving their initial enrollment goals.
For example, OHIP recently hit its 6,000-physician goal. The Chicago Health Information Technology REC enrolled nearly 1,500 doctors by the end of December 2011, and the New York eHealth Collaborative REC passed its 5,100-physician goal around the same time. Last week, the Kentucky REC also announced it had reached its initial 1,000-doctor enrollment goal.
Currently, there isn’t one model for how RECs should be organized or work. Each is designed differently, and ONC is watching to see which are successful.
“A lot of these models will fail, but the ONC is looking for the ones that work,” said Gregg Alexander, M.D., a pediatrician at Madison Pediatrics in London, Ohio. “They’re looking for the home-run hitters so they can replicate that model nationwide.”
Alexander has been involved with OHIP since its inception and currently serves on its Board of Directors. Early on, he said, OHIP decided to help providers identify the best EHR solution for their offices by offering an EHR selection tool. Produced by medical-device manufacturer Welch Allyn and originally priced at $1,800, the tool is available through OHIP for $50.
In a further effort to guide providers, OHIP staff analyzed the available vendors, starred five as preferred, and negotiated discounts of 15 to 20 percent off the systems’ usual fees.
Early skeptics wondered whether RECs would be effective, but many on-the-ground leaders have been pleased with the progress. In fact, many say providers see RECs as a trusted advisor.
“The effectiveness of the RECs, in my opinion, is surprisingly high. I think the challenge in terms of the numbers in showing people meaningful use is not so much an efficacy of the RECs, but rather the complexity of getting people to meaningful use,” said Sean McPhillips, Kentucky’s REC project manager. “For example, some vendors may not have the immunization interface enabled to help information exchange, and that creates a huge obstacle. Or we had one vendor whose quality reporting module is not working properly for some reason, so all of their quality reports are coming up zero. That means the provider who wants to test for meaningful use right now can’t.”
Challenges RECs Face
Although the relationship between RECs and vendors has benefitted facilities and physician practices nationwide, that coordination has also been one of the biggest criticisms lobbed at the groups. Some industry leaders worry that rather than giving providers impartial advice, RECs will be little more than promotion mechanisms for certain vendors.
It’s a delicate dance for RECs and vendors to avoid this perception, said Tom S. Lee, Ph.D., CEO and founder of cloud-based EHR software provider SA Ignite. By offering support services as a third party, a vendor can add value to a REC without simply pushing its own products.
“Vendors play an important role,” Lee said. “RECs can’t drive the value of EHRs to 100,000 providers alone, and vendors can provide the information technology support that the physicians will need.”
Another significant challenge is funding. Each REC currently has financial support from the ONC; however, that money will eventually evaporate. To supplement these funds, some RECs charge for their services, McPhillips said. For example, Kentucky providers who do not fall under Medicare can purchase the REC’s tool kit and six hours of consultation for $500. Additional services are available at an hourly rate.
In some ways, though, the most significant roadblock RECs face is the physicians they’re designed to help. As with Jennings’ boss, many older physicians closer to retirement resist becoming EHR-savvy. However, as they leave practice, Alexander said, EHRs will organically grow to be the dominant patient record system.
Primary Medical Specialists’ Experience
From the moment Jennings contacted the closest REC to her office, located in Athens, the practice had a knowledgeable partner who could answer any EHR-related question, provide detailed information about preferred vendors, and secure a discounted price.
“After connecting with the REC, we stopped looking for a vendor independently,” Jennings said. “We had an additional layer of confidence knowing the vendor we selected had received a stamp of approval from our REC.”
Primary Medical used the Welch Allyn tool to weed through OHIP’s preferred vendors, eventually pinpointing the electronic system that best fit its needs. Throughout the selection process, whenever she or a provider had a question, on anything from software to vendor contracts, their REC representative provided timely, unbiased feedback by phone or email.
The one drawback to working with OHIP, Jennings said, has been how long it took her to find out about the organization. Knowledge of the group isn’t widespread.
“I don’t know how many doctors know that there’s help from the government for this,” she said. “But healthcare providers in general need as much help as we can get to select an electronic records vendor that is right for us.”
Primary Medical has no plans to sever its relationship with its REC after it reaches EHR proficiency. Instead, the office intends to stay connected to the REC as it faces healthcare’s next information technology challenge, the health information exchange (HIE).
What’s Next For RECs?
Guiding providers to meaningful use proficiency is a time-limited responsibility. To maintain relevancy, REC leaders are already considering what the groups’ next steps might be.
For OHIP, the next big challenge will be integrating providers into the state’s HIE. But there are other opportunities available, such as coding or patient privacy, to help providers and patients bridge the existing knowledge gap, McPhillips said. The goal, he said, will be to foster a better-educated, empowered patient population that can participate more actively in its own healthcare.
“There’s a lot of transformation going on in healthcare today, whether it be health information technology, healthcare reform, patient-centered medical homes …” he said. “It’s a sophisticated industry in which the principal consumer, the patient, is really at a knowledge loss – so the challenge to empower the patient is health literacy. So we recognize that as a huge opportunity that is untapped.”
To read the article in its original location: http://www.porterresearch.com/Resource_Center/Blog_News/Industry_News/2012/January/Regional_Extension_Centersx_Moving_Physicians_Forward_with_Meaningful_Use.html
Published on the Jan. 24, 2012, DiagnosticImaging.com website
By Whitney L.J. Howell
Over the past decade, the field of mammography has become a paradox. Leading medical organizations disagree about the benefits of the study and the best age for a baseline exam, but the number of scans is rising. At the same time, the number of facilities and radiologists willing to read these studies is falling.
As of January 1, according to the American College of Radiology, there were 8,125 accredited mammography facilities nationwide, down from 9,400 in 2000. Many in the industry have turned to telemammography as the best way to ensure patients still have access to screening and diagnostic scans. The number of companies offering telemammography is still small, but the group is growing.
“The potential for telemammography is huge — women over 40 will need to have their mammograms,” said Timothy Myers, MD, a reading radiologist with teleradiology company vRad. “The issue, however, is there just aren’t a lot of players. Teleradiology is just now coming to an age where it’s easy to transfer images.”
As part of its teleradiology services, vRad also offers telemammography.
The premise behind telemammography is the same as general teleradiology — a radiologist reads the studies at a location other than the clinical setting of service. Today, most telemammographers are compliant with the Mammography Quality Standards Act and are licensed in both their states of residence and practice. This strategy does have hardware requirements unique to mammography, however, including high-resolution, megapixel displays that provide a high degree of image clarity for both sending and receiving providers.
Although there is some disagreement among industry experts and practitioners about whether telemammography is equally useful for screening and diagnostic mammograms, overall the strategy has received a warm — if slow — reception.
To read the remainder of the article: http://www.diagnosticimaging.com/teleradiology/content/article/113619/2021608
Published in the Jan. 23, 2012, Raleigh News & Observer and Charlotte Observer
By Whitney L.J. Howell
These days, downtown Durham is better known for its burgeoning cuisine scene and modernized, reclaimed spaces than its long history with cash crops. The storied days of tobacco curing in the city’s brick factories have been replaced with technology research that could, scientists say, bolster the biofuel industry while creating stronger crops.
Inside its redesigned, sustainable lab space near the Durham Performing Arts Center, GrassRoots Biotechnology uses patented research methods to study plant genes.
The goal, said company co-founder Philip Benfey, is to pinpoint ways to strengthen plants considered useful in biofuel production, such as switchgrass.
“With this research, we’re looking toward the future,” said Benfey, a genomics professor with Duke University Institute for Genome & Science Policy. “Over the past five or six years, as the price of oil has risen, there’s been increased interest in the idea of biofuels. There’s an opportunity to use the discoveries we’ve made in the academic lab in the commercial system.”
GrassRoots, launched in 2007, uses its two technology platforms – RootArray and Root Imaging – to dissect plant gene regulation, understand gene function and identify root traits.
Ultimately, the company wants to use its findings to boost crop yield and strength, Benfey said. It’s an objective that is in line with the Biofuels Center of North Carolina, a General Assembly-funded endeavor that supports biofuel production statewide.
But making better energy sources of biofuel plants like switchgrass may not be as simple as tinkering with the plant’s genetic structure. The parts of the plant that make it useful for biofuels are locked away behind a tough, woody polymer called lignin. N.C. State biological and agricultural engineering associate professor Ratna Sharma-Shivappa said breaking down the lignin can be difficult. Often, she said, harsh chemicals must be used to separate the energy-producing carbohydrates from the rest of the plant.
Making plants glow
The main focus of GrassRoots’ scientists is creating a faster-growing, more extensive root system in switchgrass, a crop identified by the U.S. Department of Agriculture as yielding high energy levels. Using RootArray, a system Benfey developed, scientists track how gene expression changes from one generation of plants to another, weeding out weak genes and passing along the strong ones.
GrassRoots’ long-time partner Monsanto, the global agricultural and farm production firm, is particularly interested in using the RootArray findings to produce hardier plants that can meet the world’s food and fuel needs, Benfey said.
Douglas Eisner, GrassRoots’ co-founder, said scientists put seeds into the individual holes of a multiwell tray. A promoter – the region of a plant’s DNA that controls when a gene is switched on – is tagged with green fluorescent protein. The plants grow for up to six days in a nutrient-rich gel environment, and wherever the tagged gene is expressed, the plant glows green.
Scientists manipulate each gel environment to see how the plants respond. For example, one tray of plants could be exposed to a high-salt environment, or another to low nitrogen levels. The RootArray process is highly efficient, Eisner said, allowing GrassRoots to test and observe the expression of 100 times more genes than other current technologies.
“Understanding the unique root structure for crops is critical because nutrients are often scarce in soil,” Eisner said. “If we know how plants will respond to the soil and what their uptake of the nutrients and bacteria is, then we will know what to do so they can cope with their environments.”
Tracking gene expression will help GrassRoots identify the genes and genomic markers that correspond to various traits, presenting the opportunity to breed out weaker characteristics and only pass along traits that make plants stronger.
Such changes, he said, will allow biofuel plants to be grown on arid land, in places with high salt content, and in soil with low nitrogen.
Studying the root
It’s also important to study the roots themselves, Benfey said. Through Root Imaging, scientists take 40 images in a 360-degree view of the roots grown in gel. Those images are used to produce three-dimensional root reconstructions that can be compared with other roots from different varieties of the same plant.
“It’s a way for us to see how various types of the same plant differ,” Benfey said. “We can ask questions about why the differences are there – what makes one variety thrive in a low phosphorus environment (when) others fail. This technique could help us even design roots that better withstand drought.”
Switchgrass is a particularly good crop for this type of study, he said, because some roots grow deep into the ground while others stay shallow but spread out extensively.
Currently, however, switchgrass takes a full growing season to produce any yield. Ultimately, Benfey said, he hopes GrassRoots’ research results in a faster-growing, faster-producing switchgrass that can grow in marginal soil conditions, such as North Carolina’s red-clay environments.
Understanding the root system and knowing what a plant looks like underground is beneficial in many ways, he said.
“We have the capacity to make plants stronger. It’s not just for biofuels, and it’s not just for agriculture. This research and its results are applicable to both areas.”
To read the article on the Raleigh News & Observer website: http://www.newsobserver.com/2012/01/23/1798936/making-biofuels-grow-glow.html
To read the article on the Charlotte Observer website: http://www.charlotteobserver.com/2012/01/23/2950470/making-biofuels-grow-glow.html#storylink=misearch
Published on the Jan. 19, 2012, DiagnosticImaging.com website
By Whitney L.J. Howell
When the CT images came through to Children’s Hospital of Boston, the attached physician’s note indicated the patient had an epidural hematoma. The diagnosis was accurate, but doctors had grossly underestimated the severity.
“When our radiologists looked at the images, they decided the epidural hematoma was much larger than the referring physicians thought,” said Richard Robertson, MD, Children’s Hospital’s radiologist-in-chief. “Rather than wasting time admitting the patient to the emergency department, we routed the child directly to the operating room.”
Providing the best care hinged on viewing the scans prior to the patient’s arrival, Robertson said, and he credited cloud image sharing with making that possible. It’s that early access to imaging studies that gives doctors a jump start in treating patients with urgent needs.
A significant shift in practice, online image transfer eliminates the possibility that an image-containing CD will be lost when a patient visits a new provider or clinical setting. If a patient forgets the CD or it is misplaced, you face having to either postpone service or repeat scans — and repeat scans are expensive. According to a 2008 McKinsey Global Institute report on diagnostic services, duplicated studies accounted for $26.5 billion in unnecessary healthcare costs.
But data exchange via the cloud isn’t a new idea. Financial advisory firm Merrill Lynch estimated in 2008 that cloud computing was already a $95 million industry. Cloud-based image sharing, a $56.5 million subset, is also quickly gaining popularity, and business intelligence firm GlobalData anticipates it will grow by an additional 27 percent before 2018. At the November 2011 RSNA meeting in Chicago, several vendors unveiled cloud-based systems as the technology gains ground in the imaging field.
How It Works
In most ways, the cloud is synonymous with the Internet. However, images sent via a cloud image sharing solution can be viewed only within that system rather than being freely and publicly available.
In most cases, cloud image sharing strategies closely resemble your PACS, said Florent Saint-Clair, the general manager for eMix, the cloud image sharing solution from San Diego-based vendor DR Systems. The cloud platform hovers above the PACS like a membrane, allowing you both to receive outside images and to send your images to other providers.
“Cloud image sharing is a very cost-effective solution because it doesn’t require monetary investment in a lot of hardware,” Saint-Clair said of the cloud solution that has about 350 hospital customers. “It’s an excellent way to go for groups or hospitals that don’t want to solely own an image sharing solution.”
Sharing images is a one-step process for hospitals or clinics that are part of the same cloud network. For unaffiliated hospitals, it’s almost as easy, Saint-Clair said. After confirming the identities of each provider, cloud image sharing can proceed over a virtual private network (VPN). As part of this connection, the provider receiving the image uses a password sent via email to log into the cloud server and see the studies.
According to Hamid Tabatabaie, CEO of Massachusetts-based vendor lifeIMAGE, cloud image sharing can also free up space in your PACS. Instead of downloading the image into your system, you can read the image within the cloud sharing platform and delete it when you no longer need it.
“Cloud image sharing solutions are really the custodian of images that are shared between providers,” he said. “It’s quickly becoming an accepted way to handle data.”
To read the remainder of the article: http://www.diagnosticimaging.com/informatics-pacs/content/article/113619/2020231