Minding Our Health: The Nudge, Part Two

Reposted with permission from Wrench in the Gears.

This piece expands upon my prior post about digital nudging and behavioral economics. Disruption in the healthcare industry mirrors the ed-tech takeover that is well underway in public education. If you explore the webpage for Catalyst, the innovation PR outlet for the New England Journal of Medicine (remember, social impact policy makers and many investors are based in Boston), you’ll notice the language being used to direct health care providers towards big-data, tech-centered solutions is eerily similar to the language being used on educators and school administrators.

The FCC’s “Connecting America: The National Broadband Plan” of 2010 outlined seven “national purposes” for broadband expansion. Healthcare and education were the first two topics covered in that report. Both chapters focus on “unlocking the value of data.” Who will the big winners be as we further digitize our lives? My assessment is the telecommunications industry and national security/police state will come out on top. Locally, Comcast and Verizon are key players with interests in both sectors.

Education and healthcare fall under the purview of Lamar Alexander’s Senate HELP (Health, Education, Labor and Pensions) Committee, so the similarities in tactics shouldn’t come as a surprise. While researching the $100 million federal Social Impact Partnerships to Pay for Results Act (SIPPRA) launch I attended in Washington, DC last month, I noticed that one of the Republican senators who presented, Todd Young of Indiana, had attended the Booth School of Business MBA program at the University of Chicago. Richard Thaler, the recent Nobel Prize winner in behavioral economics, teaches there, and I was curious to see if Thaler’s thinking had influenced Young. Interactive version of Young’s map here.

I located C-SPAN coverage of a Senate hearing on healthy lifestyle choices in which Young participated on October 19, 2017 (transcript follows). Lamar Alexander chaired that hearing, with Patty Murray, who inserted Pay for Success provisions into ESSA, serving as ranking member. Behavioral economics was discussed extensively. Young’s remarks start at timestamp 34:00.

https://www.c-span.org/video/standalone/?435978-1/senate-panel-explores-healthy-lifestyle-choices

The topic of the hearing was reducing healthcare expenditures on chronic illness, which witnesses claimed would amount to hundreds of billions of dollars in “savings.” Given the amount of money on the table, it seems clear this sector is ripe for outsourced, outcomes-based contracts that will deploy emerging technologies like healthcare wearables. Six measures of good health were identified during testimony: blood pressure, cholesterol level, body mass index, blood sugar, smoking status, and either the ability to meet the physical requirements of your job or (per the Cleveland Clinic witness) unmanaged stress.

The claim was that if an insured person met four of the six measures, saw a doctor regularly, and kept their vaccinations up to date, they would avoid chronic illness 80% of the time. Of course, the conversation was entirely structured around individual “choice” rather than the economic and racial systems that make it difficult for people to maintain a healthy lifestyle.

This neoliberal approach presumes people have free time for regular exercise, not considering they may be cobbling together several gigs to make ends meet. It presumes the availability of healthy food choices, when many black and brown communities are food deserts with limited access to fresh produce. It presumes the stress in people’s lives can be managed through medicalized interventions and does not address root causes of stress in communities steeped in trauma. It presumes ready access to a primary care physician in one’s community.

It is a gross simplification to push responsibility for chronic health conditions solely onto the individual, giving a free pass to social systems designed to harm large subsets of our communities. By adopting a data-driven approach to health outcomes, as appears to be the case with the above six measures (check-a-box health), the federal government and healthcare systems are setting health care consumers up to become vehicles for data generation, much as public education students are forced to access instruction via digital devices. Imagine standards-based grading, but with health measures.

The people who provided testimony at the October 19 hearing included Steven Burd, former CEO of Safeway, now at Burd Health; Michael Roizen of the Cleveland Clinic; David Asch, director of the Wharton School’s Center for Health Care Innovation; and Jennifer Mathis of the Bazelon Center for Mental Health Law, representing the Consortium for Citizens with Disabilities. Mathis was the only one who testified strongly on behalf of the rights of the insured to withhold personal information, and she was very concerned about the discriminatory nature of incentivized medical insurance programs, particularly with regard to people with disabilities.

In his testimony, David Asch, director of the Center for Health Care Innovation based in the University of Pennsylvania’s Wharton Business School, described effective designs for health incentive programs, noting that, from the insurer’s point of view, concerns about losing money motivate behavior change more effectively than the prospect of receiving financial rewards. For that reason, Asch said, taking money away from someone should be considered before offering a reward. Asch also noted that effective programs included emotional engagement, frequent rewards (tweaked to people’s psychological foibles so they didn’t have to be too large), contests, and social norming, including the use of public leaderboards.
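To make the loss-framing idea concrete, here is a minimal sketch, assuming illustrative dollar amounts and goal counts that are not from the hearing. It contrasts a gain-framed reward (earn a small payment each day a goal is met) with the loss-framed design Asch describes (the same money is pre-funded and then deducted for each miss), so the expected cost to the insurer is identical and only the psychological framing changes.

```python
# Minimal sketch of gain-framed vs. loss-framed health incentives.
# All names and dollar figures are illustrative assumptions, not the hearing's.

def gain_framed_payout(days_met_goal: int, reward_per_day: float = 1.40) -> float:
    """Pay a small reward for each day the activity goal is met."""
    return days_met_goal * reward_per_day

def loss_framed_payout(days_in_month: int, days_met_goal: int,
                       reward_per_day: float = 1.40) -> float:
    """Pre-fund the full month's reward, then deduct for each missed day.
    The payout equals the gain-framed version; only the framing differs."""
    upfront = days_in_month * reward_per_day
    deduction = (days_in_month - days_met_goal) * reward_per_day
    return upfront - deduction

if __name__ == "__main__":
    # Both schemes pay $28.00 for 20 of 30 days met, but behavioral economics
    # predicts the loss-framed version changes behavior more, because losses
    # loom larger than gains.
    print(gain_framed_payout(20))        # 28.0
    print(loss_framed_payout(30, 20))    # 28.0
```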

The date of the hearing is interesting, because right around the same time public employees (including teachers) in West Virginia were facing dramatic changes to their insurance plans. These changes included compulsory participation in Go365, an app-based health incentive program that required completing intrusive surveys, wearing a Fitbit (with a $25 fee imposed each month if you didn’t), and meeting a certain step count per day. I include a transcription of testimony from one of these teachers, Brandon Wolford, given at this spring’s Labor Notes conference, at the end of this post.

The incorporation of mHealth (mobile health) technologies is a key element of the healthcare disruption process. Increasingly, wearable technologies will transmit real-time data, surveilling the bodies of the insured. mHealth solutions are being built into healthcare protocols so that private investors will be able to track which treatments offer “high-value care.” The use of wearables and health apps also permits corporate health systems to insert digital “nudges,” derived from calculated behavioral economic design, into the care provided, and to monitor which patients comply and which do not.

At the moment, the tech industry is working intently to integrate Blockchain technology with Internet of Things sensors like Fitbits and health apps on smartphones. Many anticipate Blockchain will become a tool for securing IoT transmissions, enabling the creation of comprehensive and supposedly immutable health data logs, which could be key to mHealth expansion. Last summer the Medical Society of Delaware, in a state that touts itself as a Blockchain innovator, announced a partnership with Symbiont to develop healthcare records on Blockchain. Symbiont’s website claims it is the “market-leading smart contracts platform for institutional applications of Blockchain technology.” The company’s initial seed round of funding took place in 2014, with a second round raising an additional $15 million in May 2017, according to its Crunchbase profile.
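For readers unfamiliar with what an “immutable” health data log means in practice, here is a minimal sketch; it is my own illustration, not Symbiont’s platform, and the field names and SHA-256 scheme are assumptions. Each wearable reading is appended as a block whose hash commits to the previous block, so altering an earlier entry later is detectable.

```python
# Minimal sketch of a hash-chained, append-only health data log.
# Illustrative only; field names and the hashing scheme are assumptions.
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

class HealthLog:
    def __init__(self) -> None:
        genesis = {"index": 0, "prev": "0" * 64, "payload": None, "ts": time.time()}
        self.chain = [genesis]

    def append(self, payload: dict) -> None:
        prev = self.chain[-1]
        block = {"index": prev["index"] + 1, "prev": block_hash(prev),
                 "payload": payload, "ts": time.time()}
        self.chain.append(block)

    def verify(self) -> bool:
        # Every block must commit to the hash of the block before it.
        return all(self.chain[i]["prev"] == block_hash(self.chain[i - 1])
                   for i in range(1, len(self.chain)))

log = HealthLog()
log.append({"device": "wearable-123", "steps": 7421, "heart_rate": 68})
log.append({"device": "wearable-123", "steps": 9102, "heart_rate": 71})
print(log.verify())                       # True
log.chain[1]["payload"]["steps"] = 99999  # tamper with an earlier entry...
print(log.verify())                       # ...and the chain no longer verifies: False
```

The same property that makes such a log tamper-evident also makes it a durable, consolidated record of the insured person’s body, which is precisely what raises the surveillance concerns discussed above.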

The July/August 2018 issue of the Pennsylvania Gazette, the alumni magazine for the University of Pennsylvania, features Blockchain as its cover story, “Blockchain Fever.” The extensive article outlines use cases being considered for Blockchain deployment, including plans by a recent Wharton graduate to develop an application that would certify interactions between healthcare agencies and Medicare/Medicaid recipients for reimbursement. The University of Pennsylvania Health System is deep into innovative technologies. David Asch, director of Penn’s Center for Health Care Innovation, testified at the October 2017 hearing. The Penn Medicine integrated health system was created in 2001 by then-UPenn president Judith Rodin in collaboration with Comcast executive David Cohen. Rodin went on to head the Rockefeller Foundation, and in the years that followed the foundation spearheaded the creation of the Global Impact Investing Network (GIIN). GIIN fostered growth of the social impact investing sector at the same time healthcare began to transition away from fee-for-service reimbursement towards a value-based model predicated on outcomes met.

Below is a relationship map showing the University of Pennsylvania’s involvement in “innovative” healthcare delivery, which I believe stems from Rodin and Cohen’s connections to Comcast. It is important to note that the Center for Health Care Innovation claims to have the first “nudge unit” embedded within a health system. Asch is an employee of Wharton, and Wharton is leading initiatives in people analytics, behavior change via tech, and Blockchain technologies. Interactive version of the map here.

New types of employer-based health insurance systems have started to emerge over the past six months. Based on this New York Times article, it seems employees of Amazon, JPMorgan and Berkshire Hathaway will have a front-row seat as these technological manipulations unfold. Last fall Sidewalk Labs, the “smart cities” initiative of Alphabet (parent company of Google), announced an expansion into managed healthcare. City Block (read: Blockchain) will tackle “urban health” and populations with “complex health needs.”

Reading between the lines, it appears Alphabet aims to use poor black and brown communities that have experienced generations of trauma as profit centers. Structural racism has created a massive build-up of negative health outcomes over generations. Now, with innovative financial and technological infrastructures being rapidly put into place, these communities are highly vulnerable. Ever wonder why ACEs (Adverse Childhood Experiences) have scores? I expect those numbers are about to be fed into predictive profiles guiding social impact investment metrics.

How convenient that the “smart city” solutions Sidewalk Labs is likely to promote will come with IoT sensors embedded in public spaces. How convenient that healthcare accelerators are developing emerging technologies to track patient compliance down to IoT-enabled pill bottle caps: sensors that allow corporate and government interests to track a person’s actions with precision, while assessing their health metrics in excruciatingly profitable detail. Technology platforms are central to City Block’s healthcare program. Many services will take place online, including behavioral health interventions, with the aim of consolidating as much data as possible to build predictive profiles of individuals and facilitate the evaluation of impact investing deals.

As an interesting aside, I have two friends who had emergency room visits at Jefferson Hospital this summer and were “seen” by doctors on a screen, with an in-room facilitator wielding a camera for examination purposes. This is in a major East Coast city served by numerous research hospitals. Philadelphia is not Alaska. Where is that data going? Where were those doctors, anyway?

As these surveillance technologies move full steam ahead, it would be wise for progressive voices invested in the “healthcare for all” conversation to begin considering strategies to address the serious ethical concerns surrounding wearable technologies, tele-health / tele-therapy, and value-based patient healthcare contracting. If guardrails are not put in place that guarantee humane delivery of care without data profiling, the medical establishment may very well be hijacked by global fin-tech interests.

As someone who values the essence of the platform put forward by Alexandria Ocasio-Cortez, I worry supporters may not understand that several key elements of her platform have already been identified as growth sectors for Pay for Success. If public education, healthcare, housing and justice reform are channeled by global financial interests into outsourced, outcomes-based contracts tied to Internet of Things tracking, we will end up in an even worse place than we are now. So, if you care about progressive causes, please, please get up to speed on these technological developments. You can be sure ALEC already has, and remember that Alibaba (Sesame Credit) joined in December. It’s not too much of a stretch to imagine patient rating systems regulating healthcare access down the road if we’re not careful.

Senator Todd Young was the first person to respond to witness testimony during the hearing, and his line of questioning revealed he is a strong advocate of Thaler’s “nudge” strategies. The “nudge” is a key feature of “what works,” “Moneyball” government, which deploys austerity to push outsourcing and data-driven “solutions” built on digital platforms that will gather the data required to prove “impact” and reap financial returns. See this related post from fellow researcher Carolyn Leith, “A Close Reading of Moneyball for Government and Why You Should Be Worried.”

Young asked David Asch of Wharton’s Center for Health Care Innovation what employers could learn from behavioral economists. He also posed several specific suggestions for scaling such programs within the federal government, namely: embedding units charged with experimenting with behavioral economics into federal government programs; developing a clearinghouse of best practices; and bringing behavioral scientists into the Congressional Budget Office.

Asch, a doctor employed by the Wharton Business School, runs UPenn’s Center for Health Care Innovation, created in 2011 to test and implement “new strategies to reimagine health care delivery for dramatically better VALUE and patient OUTCOMES” (emphasis added). The 28,000-square-foot facility houses simulation learning labs and an accelerator where research on the use of “smart” hospital systems, social media, and emerging technologies in healthcare is conducted. The accelerator aims to rapidly prototype and scale “high impact solutions,” read Pay for Success.

Besides the Acceleration Lab, the Center also contains the Nudge Unit, which according to its website is the world’s first behavioral design team embedded within a health system. The goal of the unit is to “steer medical decision making towards HIGHER VALUE and improved patient outcomes” (emphasis added). Sample healthcare nudges include embedding prompts in digital platforms (for screenings), changing default settings (to generic prescriptions), framing information provided to clinicians (not sure what this means), and framing financial incentives as a loss.

This is longer than intended, and hopefully it provides some food for thought. This life-datifying impact investing machine we are up against isn’t just coming for public education; it’s coming for ALL human services. We need to begin to understand the depth and breadth of this threat. I’m still mulling over a lot of this myself, and my knowledge base in healthcare is much shallower than my expertise in education. I’d love to hear what folks think in the comments, and if you know of others writing on blockchain and IoT in medicine with a critical lens, send me some links. Below are transcripts from West Virginia teacher Brandon Wolford about Go365, followed by the Senator Young / David Asch hearing exchange.

-Alison McDowell

Go365 Transcript

Brandon Wolford, West Virginia Teacher: When I first began teaching in 2012 the insurance, in my opinion, was excellent, because I had worked for one year in Kentucky and I had known that, although they were being paid five to seven thousand more than we were, they still had to pay much more for their insurance. So it balanced out. However, after the first year or two I was there, that was when they started coming after us with the attacks on our insurance. First of all, the premiums, we started to see slight increases for one, and another was they started to enforce this “Healthy Tomorrows” policy.

So, the next thing you know, we get a paper in the mail that says, you know, you have to go to the doctor by such and such a date. It must be reported. Your blood glucose levels must be at a certain amount. Your waist size must be a certain amount, and if it is not, if you don’t meet all of these stipulations, then you get a $500 penalty on your out-of-pocket deductible. So, luckily for me, I eat everything I want, but I was healthy. My wife on the other hand, who eats much better than I do, salads at every meal, has high cholesterol, so she gets that $500 slapped on her just like that.

Okay so, that was how they started out. In the meantime, we have been filling these out for a year or two, and they keep saying, you know, you have to go back each year and be checked. And then comes the event that awoke the sleeping giants. The PEIA Board, which is the Public Employees Insurance Agency that represents the state of West Virginia, they, um, it’s just a board of four to five individuals that are appointed by the governor; they are not elected. They have no one they answer to; they just come up with these things on their own.

So they come to us and they say we’re raising your premiums. This was somewhere between November and December of last year. We’re raising your premiums. You’re going to need to be enrolled in a program called Go365, which means that you have to wear a Fitbit, as well as record all of your steps. You have to check in with them, and it included private questions like how much sexual activity do you perform, and is it vigorous? All of these things that they wanted us to report on our personal lives, and that was all included. In addition to that, we had to report all of those things, and if we refused to wear that Fitbit and record all of our steps, or if we didn’t make our steps, we were going to be charged an additional $25 per month.

So, when everyone sees this along with the increased premiums, then they’ve also introduced a couple more bills to go along with that, because the PEIA Board, they have the final say. Whatever they do, it’s not voted upon by the legislature. It’s basically just law, once they decide it. But in the meantime our legislature was presenting these bills. We were currently on a plan of sixty, uh excuse me, eighty/twenty we were paying out of pocket. Well, they had proposed a bill that would double that and make us pay sixty/forty.

So, they presented that along with charter school bills and a couple of other things that were just direct attacks on us. We had been going by a process of seniority for several years; and they also introduced a bill to eliminate seniority, to where it was up to the superintendent whether or not you got to stay in your position. It was up to the principal, and regardless if you were there thirty years or you were there for your first or your second year… they were trying to tell us, you know, it’s just up to your principal to decide. The superintendent decides. They don’t want you to go, you’ve been there for thirty years and you have a master’s degree plus forty-five hours, you’re gone. It’s up to them. Your seniority no longer matters. So those things combined with the insurance are actually what got things going in our state.

Excerpted Testimony Healthy Lifestyle Choices, Senate HELP Committee 10/19/17

Lamar Alexander: We’ll now have a round of five-minute questions. We’ll start with Senator Young.

Senator Todd Young: Thank you, Chairman. I’m very excited about this hearing, because I know a number of our witnesses have discussed in their testimonies behavioral economics and behavioral decision-making. I think it’s really important that we as policy makers incorporate how people really behave. Not according to an economist per se, or according to other policy experts, but based on observed behaviors. Oftentimes we behave in ways that we don’t intend to. It leads us to results that we don’t want to end up in.

So, Mr. Asch, I’ll start with you, with your expertise in this area. You’ve indicated behavioral economics is being used to help doctors and patients make better decisions and you see opportunities for employers to help Americans change their behaviors in ways they want from tobacco mitigation to losing weight to managing blood pressure and you indicate those changes are much less likely to come from typical premium-based financial incentives and much more likely to come from approaches that reflect the underlying psychology of how people make decisions, encouraged by frequent rewards, emotional engagement, contests, and social acceptance and so forth. And you said in your verbal testimony you haven’t seen much of this new knowledge applied effectively by employers, but there’s no reason why it cannot be. So, my question for you sir is what might employers learn from behavioral economists. Just in summary fashion.

David Asch, Wharton Center for Health Care Innovation: Sure. Thank you, Senator. I think I’ll start by saying there is a misunderstanding often about behavioral economics and health. Many people believe that if you use financial incentives to change behavior you’re engaged in behavioral economics, and I would say no, that’s just economics. It becomes behavioral economics when you use an understanding of our little psychological foibles and pitfalls to sort of supercharge the incentives and make them more potent, so that you don’t have to use incentives that are so large.

So I think that there are a variety of approaches that come from behavioral economics that can be applied in employment settings and elsewhere. I mentioned one, which is capitalizing on the notion that losses loom larger than gains; that might be a new way to structure financial incentives in the employment setting, in ways that might make them more potent and more palatable and easier for all employees to participate in programs to advance their health. The delivery of incentives more frequently, for example. Or using contests, or using certain kinds of social norming where it’s acceptable to show people on leaderboards in contests and get people engaged in fun for their health. All of these are possibilities.

Senator Todd Young: Thank you very much. You really need to study these different phenomena individually, I think, to have a sense of the growing body of work that is behavioral economics. Right, so we need the increased awareness, and I guess the education of many employers about some of these tics we have. That seems to be part of the answer. In fact, Richard Thaler, who just won the Nobel Prize for his ground-breaking work in this area, indicated that we as policy makers ought to have on a regular basis not just lawyers and economists at the tables where we’re drafting legislation, but a behavioral scientist as well.

And in the UK, they have the Behavioral Insights Team. The United States, our previous administration, had a similar sort of team that did a number of experiments to figure out how policies would actually impact an individual’s health and wellness and a number of other things. Some of the ideas that I think we might incorporate into the government context, and tell me if any of these sort of pop for you, if you think they make sense?

We need to continue to have a unit or units embedded within government that do a lot of these experiments. We need to have a clearinghouse of best practices that others, employers included, might draw on. This doesn’t have to be governmental, but it could certainly be. We on Capitol Hill might actually consider, aside from having the Congressional Budget Office as an official budget office, having an entity, or at least some presence within the CBO, of individuals that understand how people would actually respond to given proposals. Do any or all of those make sense to you?

David Asch: Thank you for your remarks. Yes, I think they all make sense. And one of the lessons that I guess I have repeatedly learned is that seemingly subtle differences in design can make a huge difference in how effective a program can be and how it is perceived, and we ultimately care about the impact of these programs. So, I am very much in favor of the use of these programs, but, in addition, of greater study of these programs, because I think we need an investment in the science that will help all of us in delivering these activities, not just in healthcare, but in other parts of society.

Senator Young: That makes sense. I am out of time. Thank you.

 

Data Unicorns? Tech Giants and US Dept of Ed Form Alliance to Leverage Student Data — Without Parent Consent.

Reposted with permission from Missouri Education Watchdog

Leveraging Student Data

Project Unicorn: Billionaire partners promoting data interoperability and online “Personalized Learning”

When the Unicorns “protecting” student data are interoperable with the Unicorns taking it, parents and lawmakers might want to pay attention.

According to Techopedia, in the information technology world, “a unicorn is most commonly used to describe a company, for example, a Silicon Valley startup, that started out small but has since increased its market capitalization to, say, $1 billion or more. …For example, the social media giant Facebook, which has a market capitalization of more than $100 billion, is considered as a “super-unicorn among unicorns”. It’s an interesting coincidence, because the name of a mega-financed K-12 student data alliance is a unicorn.

Meet Project Unicorn.

Project Unicorn’s Mission is to Leverage Student Data and Make Data Interoperable

Project Unicorn

Project Unicorn’s steering committee is a who’s-who of edtech bundlers, billionaires, and student data power-players. They have formed an “uncommon alliance” committed to leveraging student data by making the data interoperable, flowing seamlessly between all K-12 applications and platforms. While addressing student data security and privacy is a much-needed conversation, it would seem that Project Unicorn has the cart before the horse. There is no talk of student data ownership or consent prior to collecting and using student data; rather, per this press release, Project Unicorn will continue to take the data, make the data interoperable, and talk about it afterwards: “Once interoperability is in place, we can start working with teachers and students to ask questions about the data.” You can see by the tweets below that Project Unicorn initially claimed it wanted to “shift data ownership to the student”; they have since withdrawn that statement. Several schools and districts have been encouraged to join the Project Unicorn Coalition; we wonder if parents in these schools were given an option or are even aware of what this means. We’re going to talk about a few of the Project Unicorn partners and then circle back to their interoperability goals and how that fits with student data ownership, ethics, and the newly formed and related Truth About Tech and HumaneTech.

A few points before we start:

  • When it comes to “free” edtech products, you know that if it is free, you are the product; you pay with your data and your privacy. With edtech and 1:1 devices, personalized learning, online assessments, online homework, and LMS systems, students usually do not have a choice. Students do not have the ability to consent or opt out. Why?
  • Not all philanthropy is charity. As this article points out, for some, philanthropy is an investment; these nonprofits may “look” charitable, but they are truly meant to make money and to buy power and influence policy, and sometimes they do harm.
  • McKinsey Global estimated that increasing the use of student data in education could unlock between $900 billion and $1.2 trillion in global economic value. 
  • Children are not data points to predict, standardize and analyze. Currently, online platforms can collect every keystroke and analyze and predict children’s behaviors. Children are not meant to be experimented on, and #KidsAreNotInteroperable.
  • Currently, students’ data can be shared, researched, analyzed, and marketed without parental consent. Often, parents cannot refuse the data sharing and cannot see which data points are shared or how they are analyzed.
  • Edtech and Silicon Valley companies can gain access to personal student information without parent consent, under the School Official exception in FERPA. The US Department of Education not only promotes edtech companies, it tells tech companies HOW to gain access to student data, and is partnered in this project to make data sharing interoperable.
  • Interoperable data systems will allow even larger, very predictive data profiles of children: everything they do and are. The best way to protect privacy is to not collect data in the first place. Interoperability, with bigger, more detailed, and more sensitive data sets, and with sharing and mixing of data with third parties, is risky for both privacy and security. The US Department of Education has already warned of cyber hackers ransoming sensitive data from schools; who will be responsible and liable for more data breaches?

Back to unicorns.

How is the US Department of Education involved with Project Unicorn? 

The USDoE (your tax dollars) has been a major driving force of funding and support for online education and data interoperability. Part of data interoperability requires common data standards. CEDS (Common Education Data Standards) are codes used to tag student data; you can see these more than 1,700 different data codes, or elements, in the federal student data dictionary. These common data tags were created with the help of Bill Gates, funder of the Data Quality Campaign; read about the mission of DQC at the US Department of Education Summit here. The Data Quality Campaign also provides policy guidance to legislators and education agencies, such as this 2018 DQC Roadmap promoting cross-agency data sharing. With the shift in education focusing more on workforce talent pipelines (see both ESSA and WIOA), the Workforce Data Quality Campaign (funded by the Gates, Lumina, Arnold, and Joyce foundations) has also influenced the US Department of Labor. The US Department of Labor’s Workforce Data Quality Initiative plans to use personal information from each student, starting in preschool, via the states’ SLDS data systems. You can read more about the SLDS, the roles that the US Department of Education and Bill Gates play in student data collection, and the weakening of the federal privacy law FERPA here. In recent years Microsoft’s commitment to data privacy has been called into question, as per this EdWeek article. Even Microsoft itself admits it can take a peek and trend through student data and can put it on the market.

“If students are using certain cloud infrastructures, and it’s held by a third party, it is possible for [the vendors] to trend through the data,” said Allyson Knox, director of education policy and programs for Microsoft. “When [information] is flowing through a data center, it’s possible to take a peek at it and find trends and put it on the market to other businesses who want to advertise to those students.”

Knox said Microsoft has a “remote data center” where student information is housed but that “students’ data belongs to them.” -Microsoft https://www.fedscoop.com/lawmakers-hear-testimony-on-student-data-and-privacy/                     

Does Microsoft still believe that student data belongs to the student?

Gates: In 5 Years

Microsoft, Bill and Melinda Gates Foundation

The Bill and Melinda Gates Foundation is a nonprofit whose IRS 990 forms can be seen here and (2016) here, and the Trust’s here; their awarded grants can be seen in this searchable database. Gates spends billions on K-12 and higher ed reform. Gates (and the Data Quality Campaign) both support a national student database, and now Gates is shifting his multi-billion-dollar focus from Common Core to K-12 networks and curriculum.

(See With new focus on curriculum, Gates Foundation wades into tricky territory .)

Microsoft is desperately hoping to regain ground in the K-12 classroom 1:1 device market with management systems, cloud services, gamification of education (yes, Microsoft owns Minecraft and is promoting Minecraft in classrooms), K-12 LinkedIn data badges (yes, Microsoft owns LinkedIn, and yes, there are LinkedIn K-12 badge pilots in AZ and CO), the introduction of chatbots and artificial intelligence into education, and several online tools like Microsoft OneNote, favorably reviewed here by their unicorn partner Digital Promise. Microsoft is also part of the US Department of Education’s push for online curriculum via Open Educational Resources (OERs). Microsoft will be handling and indexing the content for the federal Learning Registry. (You can read more about how the federal Department of Defense and Department of Education are involved in OERs here.)

According to this December 2017 New York Times piece, Microsoft is fiercely trying to regain ground in the K-12 classroom market.

Tech companies are fiercely competing for business in primary and secondary schools in the United States, a technology market expected to reach $21 billion by 2020, according to estimates from Ibis Capital, a technology investment firm, and EdtechXGlobal, a conference company.

It is a matter of some urgency for Microsoft. 

Chromebooks accounted for 58 percent of the 12.6 million mobile devices shipped to primary and secondary schools in the United States last year, compared with less than 1 percent in 2012, according to Futuresource Consulting, a research company. By contrast, Windows laptops and tablets made up 21.6 percent of the mobile-device shipments to schools in the United States last year, down from about 43 percent in 2012. – https://www.nytimes.com/2017/05/02/technology/microsoft-google-educational-sales.html [Emphasis added]

Digital Promise

If you aren’t familiar with Digital Promise, it is a non-profit created by the US Department of Education to PROMOTE edtech in the classroom. Read about Digital Promise and Global Digital Promise here. Digital Promise is demanding data interoperability for school districts. Digital Promise presented its report The Goals and Roles of Federal Funding for EdTech Research at this 2017 symposium, which was funded by tech foundations and corporations such as Bill and Melinda Gates, Chan-Zuckerberg, Strada, Pearson, Carnegie… you get the idea. In its report, Digital Promise acknowledges that the federal government has spent significant money on developing and disseminating technology-based products in the classroom with little to no information on how these products are working. So, is the answer to rely on tech-financed entities and unicorns to review and research the efficacy of future edtech products? No conflict of interest there. Digital Promise also utilizes the heavily Gates-funded and controversial Relay Graduate School, which you can read about here.

The algorithm-driven Personalized Learning model does not work.

Digital Promise and others in edtech continue to push for online Personalized Learning despite many warnings from edtech insiders, including this piece from Paul Emerich, entitled Why I Left Silicon Valley, EdTech, and “Personalized” Learning. Emerich’s concerns with algorithm-driven Personalized Learning are summed up in this quote:

“It was isolating with every child working on something different; it was impersonal with kids learning basic math skills from Khan Academy; it was disembodied and disconnected, with a computer constantly being a mediator between my students and me.”

And in this piece by Rick Hess, A Confession and a Question on Personalized Learning, the CEO of Amplify admits Personalized Learning is a failure. We wish every policy wonk and educrat would read this:

…“Until a few years ago, I was a great believer in what might be called the “engineering” model of personalized learning, which is still what most people mean by personalized learning. The model works as follows:

You start with a map of all the things that kids need to learn.

Then you measure the kids so that you can place each kid on the map in just the spot where they know everything behind them, and in front of them is what they should learn next.

Then you assemble a vast library of learning objects and ask an algorithm to sort through it to find the optimal learning object for each kid at that particular moment.

Then you make each kid use the learning object.

Then you measure the kids again. If they have learned what you wanted them to learn, you move them to the next place on the map. If they didn’t learn it, you try something simpler.

If the map, the assessments, and the library were used by millions of kids, then the algorithms would get smarter and smarter, and make better, more personalized choices about which things to put in front of which kids.

I spent a decade believing in this model—the map, the measure, and the library, all powered by big data algorithms.

Here’s the problem: The map doesn’t exist, the measurement is impossible, and we have, collectively, built only 5% of the library.

To be more precise: The map exists for early reading and the quantitative parts of K-8 mathematics, and much promising work on personalized learning has been done in these areas; but the map doesn’t exist for reading comprehension, or writing, or for the more complex areas of mathematical reasoning, or for any area of science or social studies. We aren’t sure whether you should learn about proteins then genes then traits—or traits, then genes, then proteins.

We also don’t have the assessments to place kids with any precision on the map. The existing measures are not high enough resolution to detect the thing that a kid should learn tomorrow. Our current precision would be like Google Maps trying to steer you home tonight using a GPS system that knows only that your location correlates highly with either Maryland or Virginia.

We also don’t have the library of learning objects for the kinds of difficulties that kids often encounter. Most of the available learning objects are in books that only work if you have read the previous page. And they aren’t indexed in ways that algorithms understand.

Finally, as if it were not enough of a problem that this is a system whose parts don’t exist, there’s a more fundamental breakdown: Just because the algorithms want a kid to learn the next thing doesn’t mean that a real kid actually wants to learn that thing.

So we need to move beyond this engineering model…” — Larry Berger, CEO of Amplify, excerpt Rick Hess Straight Up Blog [Emphasis added]
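To make the “engineering model” Berger describes easier to picture, here is a minimal sketch under assumed data structures; the skill map, mastery records, and learning objects are hypothetical stand-ins, not anything Amplify or Berger built. It wires together the map, the measurement, the library, and the selection algorithm he lists, and its hard-coded stubs are exactly the pieces Berger says do not actually exist at scale.

```python
# Minimal sketch of the "engineering model" of personalized learning:
# a map of skills, a measurement that places each kid on the map,
# a library of learning objects, and an algorithm that picks the next object.
# All skill names and object IDs are hypothetical.
from dataclasses import dataclass, field
from typing import Optional, Set

SKILL_MAP = ["counting", "addition", "subtraction", "multiplication"]  # the ordered "map"

LIBRARY = {  # learning objects indexed by the skill they target
    "addition": ["video_add_01", "worksheet_add_02"],
    "subtraction": ["game_sub_01"],
    "multiplication": ["video_mult_01"],
}

@dataclass
class Student:
    name: str
    mastered: Set[str] = field(default_factory=set)

def place_on_map(student: Student) -> Optional[str]:
    """'Measure' the student: the next skill is the first one not yet mastered."""
    for skill in SKILL_MAP:
        if skill not in student.mastered:
            return skill
    return None  # finished the map

def next_learning_object(student: Student) -> Optional[str]:
    """The 'algorithm': pick a learning object from the library for the next skill."""
    skill = place_on_map(student)
    if skill is None:
        return None
    objects = LIBRARY.get(skill, [])
    return objects[0] if objects else None  # Berger's point: the library is mostly empty

student = Student("kid_a", mastered={"counting"})
print(next_learning_object(student))  # "video_add_01"
```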

 

And… Digital Promise just published a 2018 report promoting “Personalized Learning,” co-authored by Tom Vander Ark, here. In this report you can find such gems as this global mantra (including in the US): that learning and teaching knowledge is no longer the main goal of education; it is more important to gather data about how students think and feel.

“According to the World Economic Forum, the top five most valued skills for workers in 2020 are: 1) complex problem solving; 2) critical thinking; 3) creativity; 4) people management; and 5) coordinating with others. This is a far cry from simply needing a grasp of reading, writing, and arithmetic to be marketable to employers. While mastery of the three Rs remains critical, it is merely the launching point and no longer the end goal. We need to re-think the education system.” –US Department of Education’s Digital Promise http://digitalpromise.org/wp-content/uploads/2018/01/lps-policies_practices-r3.pdf

Getting Smart, Tom Vander Ark

Tom Vander Ark is the author and creator of Getting Smart and is a “director of 4.0 Schools, Charter Board Partners, Digital Learning Institute, eduInnovation, and Imagination Foundation, and advises numerous nonprofits.” Vander Ark was also formerly the Executive Director of Education for the Bill & Melinda Gates Foundation. Vander Ark, in this 2011 video, said that Common Core’s mandate of online assessments could be used as a lever to get computers into the classroom, computers for personalized learning to help replace teachers. Tom Vander Ark also said gone are the “days of data poverty” once we use online formative tests rather than end-of-year high-stakes tests. Vander Ark is also featured in this Global Education Futures conference; notice that Vander Ark is speaking on how to Unbundle Billions in Education.

Dell Foundation.

What could Dell computers possibly have to do with tech in schools and student data, you ask? For starters, Dell funds some heavy hitters in data analytics, such as McKinsey and Boston Consulting Group. Dell also has a “free” app for high school students called Scholar Snap, which handles students’ personal scholarship data. Interestingly, Scholar Snap is also partnered with the Common App, both of which are third-party vendors within Naviance, a K-12 workforce data platform. (You can read about Naviance and their data mining, including how Common App asks students to waive their FERPA rights, by clicking here.) Additionally, Dell (along with Gates) helps fund CoSN, the makers of the (industry self-policing, self-awarding) Trusted Learning Environment Seal for student data. CoSN also promotes data collection and personalized learning. Their “data-driven decision making” mission is to “help schools and districts move beyond data collection to use data to inform instructional practice and personalize learning.” Not surprisingly, CoSN is also co-author of this Horizon Report, touting the virtues of virtual reality (VR), robotics, and wearable tech, expected to be adopted in K-12 education within the next 3 to 5 years.

The wearable format enables the convenient integration of tools into users’ everyday lives, allowing seamless tracking of personal data such as sleep, movement, location, and social media interactions. Head-mounted wearable displays such as Oculus Rift and Google Cardboard facilitate immersive virtual reality experiences. Well-positioned to advance the quantified self movement, today’s wearables not only track where people go, what they do, and how much time they spend doing it, but now what their aspirations are and when those can be accomplished.”  –CoSN Horizon Report 2018

Side note: It’s not just students who will be required to track and share their biometric and personal data. As this New York Times piece reports, teachers in West Virginia were required to submit their personal information to a health tracking app or risk a $500 penalty.

They implemented Go365, which is an app that I’m supposed to download on my phone, to track my steps, to earn points through this app. If I don’t earn enough points, and if I choose not to use the app, then I’m penalized $500 at the end of the year. People felt that was very invasive, to have to download that app and to be forced into turning over sensitive information.

The Future of Privacy Forum

The Future of Privacy Forum is a Project Unicorn partner and DC think tank funded by many tech foundations and corporations, including but not limited to Amazon, Apple, AT&T, Comcast, Facebook, Google, Microsoft, Verizon, Samsung, Sidewalk Labs (Google’s Alphabet, Smart Cities), Walt Disney, the Bill & Melinda Gates Foundation, and the National Science Foundation; Hobsons (Naviance), Intel, Palantir, Pearson, Netflix, and Mozilla name only a few of their other big-name supporters. Their K-12 arm focuses on balancing student data privacy with supporting innovation and technology in the classroom.

New technologies are allowing information to flow within schools and beyond, enabling new learning environments and providing new tools to improve the way teachers teach and the way students learn. Data-driven innovations are bringing advances in teaching and learning but are accompanied by concerns about how education data, particularly student-generated data, are being collected and used.

The Future of Privacy Forum believes that there are critical improvements to learning that are enabled by data and technology, and that the use of data and technology is not antithetical to protecting student privacy. In order to facilitate this balance, FPF equips and connects advocates, industry, policymakers, and practitioners with substantive practices, policies, and other solutions to address education privacy challenges.

While it is fantastic to have such a well-funded group concerned about student privacy, we wish they would go further. The Future of Privacy Forum doesn’t advocate for student and parent consent before taking or using student data, nor does it say students should own their own data. We wish they advocated for the right of parents to be ensured paper-and-pencil, book, and face-to-face human teacher alternatives to online curriculum. We also wish that the Future of Privacy Forum would better highlight that predictive algorithms are not regulated or transparent; metadata and personalized, adaptive learning are exempted from state privacy laws, often with this or very similar language:

Nothing in this section

And though the Future of Privacy Forum does promote technology in the classroom, screen addiction is a concern for parents. (Although tech addiction has seen increased media coverage as of late, it’s not new; see this 2015 New York Times article on the toll that screen addiction takes on children. However, surprisingly, some would still argue that tech is not addictive.) When promoting technology in the classroom, the Future of Privacy Forum could do a better job addressing the many well-documented health risks of screen use, including behavioral changes, links to teen depression and suicide, sleep disturbance, and damage to retinas and vision loss, and could better highlight guidance from the American Academy of Pediatrics warning that wireless devices and cell phones can cause cancer.

Common Sense Media

Common Sense Media is a nonprofit that is supported by several foundations, including but not limited to: the Bezos (Amazon) Family Foundation, the Bill and Melinda Gates Foundation, the William and Flora Hewlett Foundation, Carnegie Corporation of NY, the Eli and Edythe Broad Foundation, the Michael & Susan Dell Foundation, the Overdeck Family Foundation, the R.K. Mellon Foundation, Symantec, the Anschutz Foundation, and the Annie E. Casey Foundation. Another of their investors states that “Common Sense Media provides unbiased and trustworthy information about media and entertainment that helps parents and children make informed choices about the content they consume.”

Can Project Unicorn or any of its Partners truly claim to be unbiased, since they are funded by the data driven tech industry? Since they are in a position to inform and advise on education policy, this is an important question.

Common Sense Media, even after hosting an event about tech addiction (see Truth About Tech below), is still advocating that only certain screen time exposure is addictive or concerning. Common Sense says that when it comes to screen time, “there really is no magic number that’s ‘just right.’” Parents would argue that while content is certainly important, addiction, retinal damage, cancer risk, permissionless data collection, and online safety risks apply to both educational and non-educational screen time, and affect children regardless of digital content.

Common Sense Tweet

To their credit, Common Sense Kids Action recently hosted a full-day conference (video) on “Truth About Tech: How tech has our kids hooked.” It is great to get this conversation into the spotlight, and you can see the agenda here, but there was no mention of giving students and parents ownership and control of how student data is collected, analyzed and shared. With online personalized learning and 1:1 devices being pushed at students as early as kindergarten and preschool, and no laws regulating metadata, data analytics, or hidden algorithms, limiting screen time in schools and consent for data collection should have been discussed. Instead, Common Sense, along with Project Unicorn, is focused on data interoperability to keep the K-12 data flowing, and will continue to ask parents to better control children’s screen time use at home.

Common Sense YouTube

The last segment of Common Sense’s Truth About Tech event, entitled “Solutions for Families, Schools, and Democracy,” was moderated by Rebecca Randall, Vice President of Education Programs at Common Sense, with guest speakers and Common Sense partners Dr. Carrie James, research associate at Project Zero, Harvard Graduate School of Education, and Randima Fernando of the Center for Humane Technology. This entire piece is worth your time; Mr. Fernando had some excellent points on gaming and technology. However, we are going to focus on Dr. James’ comments since, as Ms. Randall mentions, it is on Dr. James’ work regarding digital ethics that Common Sense bases their K-12 digital literacy and citizenship curriculum. Common Sense Media is about to begin working again with Dr. James and Harvard’s Project Zero to develop updated K-12 digital guidance.

At the 49-minute mark, Dr. James, answering a question about parents as role models, responds:

“We have a growing pile of evidence to suggest that parents are not doing a great job in this regard. In recent research that we’re doing with Common Sense, we’ve reached out to schools and teachers across the country and in a couple of countries around the world and asked, you know, what are some of the most memorable digital challenges your schools have faced, and a surprising number of them have to do with parents.”

With screens being so addictive, we agree that many parents, and most of society, undoubtedly could be better screen time role models, but we disagree with Common Sense’s continued emphasis only on non-educational screen use. We hope that Common Sense and their partners at Harvard Project Zero, who will be working on the new digital literacy and citizenship curriculum, will consider age-appropriate screen use, health and safety guidelines, parental consent, and data ownership for children using devices and screens for educational purposes, including online homework. Parents send their children to school expecting them to be safe. Many parents do not want their children required to use screens and technology for regular coursework and when learning core subjects. Many parents are uncomfortable with online personalized learning and would prefer face-to-face human teachers and textbooks as an option. The cost of attending public schools should not be mandatory screen exposure and loss of privacy. We hope that Common Sense will address these concerns in their work.

Project Unicorn is Promoting Interoperability. What is it?

An April 2017 Clayton Christensen Institute blog post, shared on the Project Unicorn news website, explains the path to data interoperability this way:

“The first path toward interoperability evolves when industry leaders meet to agree on standards for new technologies. With standards, software providers electively conform to a set of rules for cataloging and sharing data. The problem with this approach in the current education landscape is that software vendors don’t have incentives to conform to standards. Their goal is to optimize the content and usability of their own software and serve as a one-stop shop for student data, not to constrain their software architecture so that their data is more useful to third parties.

Until schools and teachers prioritize interoperability over other features in their software purchasing decisions, standards will continue to fall by the wayside with technology developers. Efforts led by the Ed-Fi Alliance, the Access for Learning Community, and the federal government’s Common Education Data Standards program, all aim to promote common sets of data standards. In parallel with their (sic) these efforts, promising initiatives like the Project Unicorn pledge encourage school systems to increase demand for interoperability.”  [Emphasis added] https://www.christenseninstitute.org/blog/making-student-data-usable-innovation-theory-tells-us-interoperability/

A one-stop shop for student data, flowing seamlessly for third parties: Interoperability. 
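For readers who want to see what this looks like mechanically, here is a minimal sketch; it is my own illustration, not Ed-Fi, CEDS, or any vendor’s actual schema, and every field name in it is a hypothetical stand-in. Each vendor’s export is mapped onto one common set of field names, after which any third party can merge every record into a single profile keyed to the student.

```python
# Minimal sketch of data interoperability via a common schema.
# All field names are hypothetical; this is not any real standard.
from typing import Dict, List

# Vendor-specific exports with incompatible field names
lms_record = {"studentId": "A123", "lastName": "Doe", "readingLevel": 4.2}
assessment_record = {"sid": "A123", "surname": "Doe", "math_score": 87}

# One shared schema: map each vendor's fields onto common element names
FIELD_MAPS = {
    "lms": {"studentId": "student_id", "lastName": "last_name", "readingLevel": "reading_level"},
    "assessment": {"sid": "student_id", "surname": "last_name", "math_score": "math_score"},
}

def to_common_schema(source: str, record: Dict) -> Dict:
    """Translate a vendor record into the shared field names."""
    mapping = FIELD_MAPS[source]
    return {mapping[key]: value for key, value in record.items() if key in mapping}

def merge_profiles(records: List[Dict]) -> Dict:
    """Once records share a schema, they merge into one profile keyed by student_id,
    the seamless joining across platforms that raises the privacy concerns above."""
    profiles: Dict = {}
    for rec in records:
        profiles.setdefault(rec["student_id"], {}).update(rec)
    return profiles

common = [to_common_schema("lms", lms_record),
          to_common_schema("assessment", assessment_record)]
print(merge_profiles(common))
# {'A123': {'student_id': 'A123', 'last_name': 'Doe', 'reading_level': 4.2, 'math_score': 87}}
```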

How will Project Unicorn help give students ownership of their data? Will students have consent and control over their data? We asked.

Interestingly, up until a few days ago, Project Unicorn’s Twitter profile stated that their focus is “shifting the ownership of data to schools and students.” See this screenshot from February 18, 2018, and a Twitter conversation below.

Project Unicorn Tweet 2

Project Unicorn replied the following day, but they did not immediately answer my question about student data consent and ownership. Instead, they listed a few of their partners: Data Quality Campaign, Future of Privacy Forum, Common Sense Media, and the National PTA. Again, I asked them about their statement about shifting ownership of data to the student.

Project Unicorn Tweet 3

Project Unicorn Tweet 4

Gretchen Logue also replied to Project Unicorn and their partners, asking if students can NOT have their data shared. Two days later, she still had not received a reply.

Logue

I directly asked Project Unicorn’s partner Digital Promise to help answer whether students can consent to data collection. (Remember, DP is the edtech/personalized-learning-promoting non-profit created by the US Department of Ed.) Digital Promise never responded to this parent’s questions. Maybe they just need a little more time, or maybe parents aren’t important enough to bother with?

Tweet 5

tweet 6

tweet 7

Project Unicorn replied: they changed their Twitter profile to better reflect the scope of their project. They no longer claim to shift data ownership to students. They are promoting data interoperability. To be clear: they are NOT giving students ownership of their data. See their new Twitter profile in this February 23, 2018 screenshot below.

Project Unicorn interoperability

Why do edtech companies and our government have such a problem giving students consent and true ownership of their data? Data is money. Data is identity.  Student data is NOT theirs to take. 

Without the student, the data does not exist. If a student writes an essay for a class assignment, that written work belongs to the student. If a student draws a picture in art class, that artwork is theirs. Parents (and the Fourth Amendment) would argue that personal information about a student, created by a student, should belong to the student.

#TruthinTech: Unicorns are taking student data and sharing it without consent. What say you @HumaneTech?

Humane tech

Tech is hacking kids’ brains, but it is also stealing their data; students’ every keystroke can be collected and analyzed, and student education records can be shared. (FERPA is a 40-year-old law that doesn’t cover data, metadata, or algorithms, and it was substantially weakened in 2011 to allow personally identifiable information to be shared outside of the school with nonprofits, researchers, or anyone approved as a school official or for an educational purpose, without parent consent or knowledge.) HumaneTech folks, are you good with this predictive profiling, leveraging and capitalizing of children who are held hostage in this mandatory, surveilled school system? Schools are the new smart cities, except children are a captive audience and they are being exploited. They have no choice.

Why not do real, independent research, set guidelines and protect kids from screens in schools? Why not give parents and students a choice of tech vs paper, allow the option of learning knowledge vs in-school personality surveys and emotional assessments and biometric health trackers? Why not be transparent about algorithms and analytics and get consent BEFORE collecting and using student or teacher data?

GDPR.

Europe requires consent before collecting and sharing personal data, including for automated decision making. GDPR gives Europeans (including students) more control over how their data is handled, including breach notification and penalties, data redaction, and consent. Why would American students be any less deserving than students in Europe? GDPR will have global implications. Modernizing FERPA and COPPA to align with GDPR would be both practical and ethical. Why isn’t Project Unicorn also advocating for the GDPR standard of basic human privacy and data identity rights for American citizens and children?
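As a rough illustration of the consent-first pattern GDPR embodies, here is a minimal sketch; it is not a legal implementation of the regulation, and every field and function name is a hypothetical stand-in. The point is simply that collection, third-party sharing, and automated decision making each default to “off” until the student or parent affirmatively opts in.

```python
# Minimal sketch of consent-gated data sharing (illustrative only, not legal advice).
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class ConsentRecord:
    student_id: str
    allow_collection: bool = False          # opt-in, not opt-out: default is "no"
    allow_third_party_sharing: bool = False
    allow_automated_decisions: bool = False

def share_with_vendor(record: Dict, consent: ConsentRecord) -> Optional[Dict]:
    """Release data only when both collection and sharing consent exist."""
    if not (consent.allow_collection and consent.allow_third_party_sharing):
        return None  # default outcome: the data stays with the student
    return record

consent = ConsentRecord(student_id="A123")  # nothing has been opted into
print(share_with_vendor({"student_id": "A123", "reading_level": 4.2}, consent))  # None
```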

A final question: Project Unicorn is not an elected governing body, so why are they directing US education policy? Decisions should be made democratically, by those closest to the children, instead of by a few billionaires. What gives philanthro-funders the right to leverage children’s data and encourage schools with their procurement $trategies? The edtech billionaires directing education and experimenting on children have created (and are profiting from) this data-driven problem: teachers are so busy collecting endless data points that they don’t have the time or the freedom to teach. Now the regretful tech industry wants to swoop in and make the data collection process easier and free up teachers (or replace them?) with a single-sign-on, standardized data collection tool. Children are not a product to be leveraged. Please stop using schools and children as a permissionless innovation data supply.

IMS Global

And why oh why, Project Unicorn, are you working with IMS Global?  Uncommon Alliance indeed.

“…interoperability specification for educational click stream analytics created by the education community for the education community. Major educational suppliers are using Caliper to collect millions of events every week and the data is helping to shape teaching and learning on multiple levels. Several leading institutions are also working on putting Caliper in place. Now is a great time for both institutions and suppliers to begin putting learning analytics in place using Caliper.”

IMS Global Learning Consortium

-Cheri Kiesecker