The San Francisco-based company makes this determination by looking at a woman’s age, her ZIP code, the number of children in her household, her search history within the platform’s internal health resource library, which doctors she has visited lately, whether she has filled a prescription for prenatal vitamins, whether she has filled a prescription for birth control and other related digitized data. If the Castlight Health analytics software thinks a woman may be pregnant, its system sends her relevant, personalized health information and advice within the platform.
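Castlight has not published how its model works; as a rough sketch of how signals like these could be combined into a prediction, consider the following toy scoring function, in which every feature name, weight and threshold is invented for illustration:

```python
# Illustrative only: Castlight has not disclosed its actual model.
# The feature names, weights, and threshold below are invented.

def pregnancy_likelihood(signals):
    """Combine binary signals into a crude likelihood score in [0, 1]."""
    weights = {
        "filled_prenatal_vitamins": 0.5,
        "searched_pregnancy_topics": 0.3,
        "stopped_birth_control": 0.15,
        "visited_obgyn_recently": 0.05,
    }
    score = sum(w for name, w in weights.items() if signals.get(name))
    return min(score, 1.0)

def should_send_content(signals, threshold=0.6):
    """Send tailored content only when the score clears a threshold."""
    return pregnancy_likelihood(signals) >= threshold

print(should_send_content({"filled_prenatal_vitamins": True,
                           "searched_pregnancy_topics": True}))  # True
```

A real system would presumably use a trained statistical model rather than hand-set weights, but the shape of the decision -- many weak signals feeding one threshold -- is the same.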
The goal is to empower employees to make better decisions about their health, says Alka Tandon, the Castlight Action product lead. Pregnancy is just one of any number of health conditions that the Castlight Health system can predict. Others include lower back pain, diabetes, hip and knee issues that may require surgery and cardiac care.
These conditions can be disruptive, expensive or even tragic, so getting personalized healthcare advice when you need it can be crucial -- such as when taking steps to prevent a birth defect in a future child. But this peek into the future comes at a high price.
“Let’s be clear: There is always risk,” says Kevin Johnson, CEO of a cybersecurity firm whose white-hat hackers are paid by companies to break into their security systems. “If anybody tells you that there is no risk involved, punch them in the throat because they are lying to you.”
Your doctor has your personal health information. So does your insurance company. Now, many companies give employees the option to invite a third-party analytics company into the exam room. That means more eyeballs on your chart -- and more potential cracks for that information to get out, according to Johnson.
The question here -- and with all exchanges of information in the digital realm -- is whether the risk to the individual is worth it. Will this data actually help you make better health decisions? The potential benefits are huge -- life changing -- but the risks are just as great. Tampering with your health information is more dangerous than tampering with almost any other kind of data. If your credit card security is breached, you can order a new credit card. It’s annoying, Johnson says, but it’s not going to put your life in jeopardy.
In some cases, you may not become aware that your health information was tampered with until it’s too late. Johnson explains how this scenario could work. “One of the reasons that people steal health records is to qualify for insurance for themselves. So this person who has stolen my health records so that they can get healthcare is allergic to penicillin, but I am not," he says. "All of a sudden, I have an emergency and my lungs are having problems, I am sick, I go to the hospital, they pull up my medical records, and because of this bad guy, they now believe that I am allergic to penicillin, so they don’t use it even though it would be the most effective treatment.”
The business of analyzing your healthcare data.
Healthcare costs have ballooned in the U.S., and employers are hiring third-party healthcare tech companies because there is more data and computing power available than ever before. The goal is both to empower individuals to be proactive in their own care and to lower overall healthcare costs for employers.
For example, Castlight Health’s software may help steer an employee away from costly back surgery to a less invasive, cheaper alternative by breaking down treatment options by cost. In a case study on Castlight Health’s website, IT infrastructure services company Pomeroy said it had been facing projections that healthcare costs for its 3,500 employees would increase by 50 percent over the next four years. Pomeroy has been able to save millions of dollars on healthcare costs since implementing Castlight’s software.
That has made healthcare tech big business. Other Castlight Health customers include Adobe, CVS Health, Nielsen and Viacom. Last year, the company’s revenue was up 65 percent from 2014.
Another healthcare tech company, Dallas-headquartered HealthMine, collects historical health data from medical and prescription claims, biometric data from fitness and activity trackers and laboratory data from national labs that handle blood screening. Other data that HealthMine uses, such as an individual’s health and exercise habits, is self-reported. All of that information is used to generate a personal health score for each employee -- a score designed to deliver recommendations people understand and help them fully comprehend their diagnoses.
“If you are diagnosed with hyperlipidemia, do you even know what hyperlipidemia is? Do you know what impact that has on your life and your lifestyle?” asks Christopher Chang, the company’s chief technology officer. HealthMine aims to make sure the answer to both of those questions is yes.
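HealthMine hasn’t published its scoring formula, but the idea of blending claims, biometric and self-reported data into a single number can be sketched with a toy weighted average; the component names, ranges and weights here are all invented:

```python
# Illustrative only: HealthMine's actual scoring method is not public.
# Component names, 0-100 ranges, and weights below are invented.

def personal_health_score(claims_risk, biometrics, self_reported):
    """Blend three 0-100 component scores into one 0-100 health score."""
    weights = {"claims": 0.5, "biometrics": 0.3, "self_reported": 0.2}
    score = (weights["claims"] * claims_risk
             + weights["biometrics"] * biometrics
             + weights["self_reported"] * self_reported)
    return round(score)

# e.g. strong lab results, average tracker data, good self-reported habits
print(personal_health_score(claims_risk=90, biometrics=70, self_reported=80))
```

The point of collapsing everything into one number is exactly what Chang describes: a score is easier for a layperson to track than a lab panel full of terms like hyperlipidemia.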
HealthMine does not disclose financial information, but companies like it are part of an exploding sector of the economy sitting at the intersection of healthcare and technology. By 2020, the global healthcare IT market is expected to reach $104.5 billion, according to one market research report. In the coming years, investors are expected to seek out companies driving innovation around digitizing and integrating health records systems, the report predicts.
Playing offense when it comes to privacy concerns.
To be sure, these healthcare technology companies are aware of the fine line they are walking. They understand that consumers live with a persistent fear of their private medical information being compromised. “This is definitely a concern that we have addressed from the very beginning as we started to build the product,” says Tandon of Castlight. “It was one that we heard a lot from folks that we interviewed before we even wrote a single line of code.”
Castlight keeps all of the data it collects about employees confidential, and its software is HIPAA compliant. “Employers that use our system never, under any circumstance, see individual employee data … the employee data is anonymized, aggregated,” says Jim Rivas, a spokesperson for Castlight, in an email to Entrepreneur.
Instead, the Castlight software only sees groups of no fewer than 40 employees who may be at risk for a specific condition. Then, Castlight can send tailored recommendations and advice. That minimum group size is almost four times the minimum cell size required of those reporting on Medicare or Medicaid data.
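A minimum-group-size rule like this can be enforced with a simple guard before any outreach goes out. In this sketch, the threshold of 40 comes from the article, but the data model and function names are invented:

```python
# Sketch of a minimum-cohort-size guard, in the spirit of k-anonymity.
# The threshold of 40 matches the article; everything else is invented.

MIN_COHORT_SIZE = 40

def eligible_cohort(employees, condition):
    """Return the at-risk group only if it is large enough to contact."""
    cohort = [e for e in employees if condition in e["risk_flags"]]
    if len(cohort) < MIN_COHORT_SIZE:
        return []  # too small: contacting it could single people out
    return cohort

employees = [{"id": i, "risk_flags": {"back_pain"}} for i in range(50)]
employees += [{"id": 50 + i, "risk_flags": {"diabetes"}} for i in range(5)]

print(len(eligible_cohort(employees, "back_pain")))  # 50 -> large enough
print(len(eligible_cohort(employees, "diabetes")))   # 0  -> suppressed
```

The idea is the same one behind cell-size suppression in public health data: a message aimed at a tiny group effectively identifies its members, while a message aimed at 40-plus people does not.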
To ensure these messages aren’t too invasive, each one goes through a nearly two-day consumer testing process. “If, at any point in those 40 hours, the content is flagged as creepy or insensitive or too personal, we will throw out that content and start again,” Tandon says.
At HealthMine, only a select few people within the company have access to data tied to a person’s name. HealthMine also trains support staff to be hyper-vigilant in sniffing out fraudsters who try to break into technical systems by smooth-talking customer service agents -- a tactic known as social engineering. Often, data breaches are the result of human error, not technical error, Chang says.
And there’s always a way for employees to opt out, although knowledge of the extent of these programs plays a large role, Johnson says. Often, new employees are largely oblivious to what they are agreeing to in the first-day blur of a new job. “This idea that you have to ‘opt in’ sounds really, really good, but when you start to think about benefits, when you start to think about the amount of paperwork people fill out when they join a company, nobody understands everything they are opting in to,” Johnson says. “So, that isn’t really a protection.”
Transparency does go a long way, Chang says. “We explain exactly why we are gathering information, how they might benefit from this, and we try to create a habit loop: What is in it for me? What can I do about it? What do I get out of it?” he says. “You either trust or you don’t trust your employer. And if you don’t trust your employer and you believe that such data will be used against you, you simply opt out. You don’t have to participate.”
HealthMine, Jiff and Castlight Health all encrypt data both at rest and in transit, each company says. That’s a bit technical-wonky, but while it’s relatively standard practice for data to be encrypted, or scrambled, while it’s traveling from one source to another, encrypting data while it is at rest is a higher level of data protection than most companies employ. Having data encrypted at rest means that even if a hacker were able to penetrate a server where data is stored, the hacker would not be able to understand or retrieve that data.
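As a toy illustration of why encryption at rest matters -- not how these companies actually do it; production systems use vetted ciphers such as AES through audited libraries -- here is a stdlib-only one-time-pad sketch showing that stored ciphertext is useless without the key:

```python
# Toy illustration of encryption at rest using a one-time pad.
# Real systems use vetted ciphers (e.g. AES) via audited libraries;
# this sketch just shows that ciphertext sitting on disk is
# unreadable to an attacker who lacks the key.
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    assert len(key) == len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

record = b"allergies: none"
key = secrets.token_bytes(len(record))   # kept separately from the data
stored = encrypt(record, key)            # what sits "at rest" on disk

assert stored != record                  # unreadable without the key
assert decrypt(stored, key) == record    # recoverable with it
```

Encryption in transit protects the data on the wire; encryption at rest protects this `stored` blob even after an attacker reaches the server -- which is exactly the scenario the passage above describes.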
We are more willing than ever to live our lives online, but our health information is still private.
Consumers are living more and more of their lives online. Our expectations of privacy are being outweighed daily by the desire to get feedback and information. Mark Zuckerberg is partially to thank, or blame, for that.
“In the last five years, with the explosion of smartphones and especially social media, people are just so much more comfortable. LinkedIn, Facebook and Instagram -- their lives are online and are shared largely publicly, so they are more likely to share now than they were five years ago for sure,” says Derek Newell, CEO of Jiff.
Even as consumers live more and more of their lives online in a public format, they are still more hesitant to share health information than they are to post their latest vacation snapshots on Instagram. “Let’s say Facebook took people from willing to share 10 percent to willing to share 90 percent in their personal lives. We have gone from 1 percent willingness to share in the same time period to maybe 50 percent willingness to share” personal health information, Newell says. “People do not want their health information shared in any public way.”
The Mountain View, Calif.-headquartered company works with large corporate clients including Johnson & Johnson, Qualcomm, Activision Blizzard and Red Bull. In addition to gathering medical and biometric data, Jiff uses the search history of employees within the Jiff healthcare system to classify individuals into one of five categories depending on their engagement with their own healthcare: engaged, aspirational, independent, unmotivated or moderate. The company adjusts its outreach to employees based on their category.
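Jiff hasn’t said how that classification works; as a sketch, bucketing by crude activity signals might look like this, where the five category names come from the article but the signals and thresholds are invented:

```python
# Sketch: bucketing employees by engagement. The five category names
# come from the article; the signals and thresholds are invented --
# Jiff has not published how its classification actually works.

CATEGORIES = ["unmotivated", "moderate", "independent",
              "aspirational", "engaged"]

def engagement_category(searches_per_month, programs_joined):
    """Map crude activity signals to one of the five categories."""
    activity = searches_per_month + 5 * programs_joined
    if activity >= 20:
        return "engaged"
    if activity >= 12:
        return "aspirational"
    if activity >= 6:
        return "independent"
    if activity >= 2:
        return "moderate"
    return "unmotivated"

print(engagement_category(searches_per_month=8, programs_joined=3))  # engaged
```

Whatever the real inputs are, the output feeds the same downstream decision the article describes: which tone and cadence of outreach each employee receives.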
Beyond manila folders.
Doctor’s offices aren't going back to a world filled with manila folders that could easily be misplaced or stolen. Going forward, we are going to have to find a way to manage in a reality where our healthcare data is online and more and more parties are sifting through and sorting it for us.
Customers of these third-party data analytics companies need to demand that the companies have gone through rigorous third-party cybersecurity checks, says Johnson, who runs a company that does just that.
Also, one way for healthcare technology to become more secure would be a centrally regulated database that all healthcare data is checked against, so that any malfeasance could be noticed, tracked and flagged. “A great business opportunity is out there right now for somebody to solve the problem of decentralized medical records,” Johnson says.
About 15 percent of employees opt out of using Jiff’s healthcare software, Newell says. But that means that a solid majority of employees working in companies that Jiff serves opt in. Individuals are willing to take on some level of risk in sharing personal data if they get a service in exchange that improves their life, he says.
HealthMine says it’s not in the business of convincing customers to participate in its program. Instead, it presents the potential benefits and lets individuals weigh the perceived risks for themselves.
“What we do is expose the value proposition to them such that it outweighs what they perceive as the risks,” Chang says. “It’s no different from you getting up in the morning and getting in a car, bus or subway and going to work and understanding that there is an innate risk in doing so. It’s not our job to basically tell you what you think is risky or not.”