Digital Nudging: Data, Devices & Social Control

Reposted with permission from Wrench in the Gears.

Digital exhaust, virtual selves

…“Choice architects” create these systems and weave them into public policy. Through strategic application of “nudges,” citizens, otherwise “irrational actors” in the market, can be guided to conform to economists’ expectations. Through nudges, human behaviors are redirected to fit mathematical equations and forecasts…

The way we live our lives generates enormous amounts of data. Keystrokes, online payments, photos with embedded metadata, cell-tower pings, Fitbits, education management apps, search histories, avatars, and social media posts all contribute to a cloud of digital exhaust that threatens to engulf us. Our world is being increasingly datafied as smartphones mediate our daily activities and Internet of Things (IoT) sensors become integrated into our homes and public spaces.
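To make the idea of digital exhaust concrete, here is a minimal sketch in Python, using the Pillow imaging library, of how much metadata a single snapshot can carry. The file name is a placeholder, and the exact tags present will vary by device and settings:

```python
# A minimal sketch of the metadata one photo can leak, using the
# Pillow library (pip install Pillow). "photo.jpg" is a placeholder;
# the tags present will vary by camera and settings.
from PIL import Image
from PIL.ExifTags import TAGS

img = Image.open("photo.jpg")
exif = img.getexif()  # raw numeric EXIF tag IDs -> values

for tag_id, value in exif.items():
    name = TAGS.get(tag_id, tag_id)  # translate IDs to readable names
    print(f"{name}: {value}")

# Typical fields include Make, Model, DateTime, Software, and (if
# location services were on) GPSInfo: enough to place a person at a
# specific time and place from a single shared image.
```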

In the coming decade we’re going to have to navigate environments defined by ubiquitous computing and surveillance. Virtual and real worlds will meld in unsettling ways. The threat of state repression will intensify, especially for black and brown people, immigrants, refugees, the poor, and dissidents. As Charles Brennan, former CIO of the City of Philadelphia, noted at the end of an October 22, 2017 meeting, the future of policing will encompass predictive analytics, facial recognition software, and drone surveillance.

With UPenn’s GRASP lab currently managing a $27 million contract with the US Army Research Lab to develop distributed intelligence and autonomous weapons, it’s not too soon to be thinking about what comes next. To get a feel for where we could be headed, the write-up “Singapore, City of Sensors” describes what it’s like to live in a “smart nation” where EA3 devices track “Everyone, Everywhere, Everything, All The Time.”

Bits and bytes of data build up like passes from a 3-D printer, and as the data is aggregated, our digital doppelgangers emerge. Of course, they’re merely shadows of our true, authentic selves. They magnify certain aspects of our personalities while suppressing others. The data of our online counterparts can be incorrect or incomplete, yet even with all those flaws, our online profiles and reputations have begun to profoundly influence our offline lives.

As Eric Schmidt of Alphabet (Google’s parent company) says, data is the new oil: so valuable that nation-states will fight over it. From Cambridge Analytica to Cornell-Technion’s Small Data Lab to Wharton’s Behavior Change for Good program, social scientists are teaming up with venture capital, government agencies, and NGOs to devise new and intrusive ways to monitor people and extract profit from the management of our data-filled lives.

The relationship map below (click here for the interactive version) features individuals and organizations associated with the Small Data Lab, a program of Cornell-Technion based on Roosevelt Island in New York City. This research and development program is backed by influential impact investors and technology companies, including Google. If you know your way around social impact bonds, you’ll see quite a few familiar names: Goldman Sachs, Bloomberg Philanthropies and Atlantic Philanthropies. The aim is to come up with sophisticated ways to analyze digital exhaust and devise technological “solutions” that pressure individuals to conform to neoliberal economic conditions. The technological underpinnings of these app-ified “solutions” enable the capture of “impact metrics” that will fuel the growing social investment sector.

Cornell-Technion also aims to grow the STEM/cyber-security human capital pipeline, having recently accepted a $50 million gift from Tata Consultancy Services, one of India’s most highly capitalized IT companies, to build an innovation center on its campus. The program plans to do outreach into New York City schools to promote skill development in AI and human-computer interaction.

PTB Ventures, “Project Trillion Billion,” is one example of a company positioning itself for this new market. A financial backer of Learning Machine, spun out of the MIT Media Lab and specializing in Blockchain education credentials, PTB has also invested in Callsign (digital identity authentication), Element (biometrics), and DISC Holdings (digital payments and credit on blockchain). Its website anticipates a future where trillions of devices will be connected to billions of humans, creating trillions of dollars in economic value. These investors hope to use connected devices and sensors to mine the lives of the global poor and dispossessed for the economic benefit of the social impact and fin-tech sectors.

Proposals are beginning to emerge for online platforms that combine decentralized identifiers (DIDs, used to create self-sovereign digital identities), e-government transactions, and online payment systems (including public welfare benefits) with “digital nudges” grounded in behavioral economics. See the screenshot taken from the Illinois Blockchain Task Force’s January 2018 report. It shows a desire to digitally incentivize healthy eating purchases for people receiving SNAP benefits.
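For readers curious what a decentralized identifier actually looks like under the hood, below is a simplified sketch of a W3C-style DID document, written out as a Python dictionary. The “did:example” method and the key material are placeholder values patterned on the public spec’s examples, not any production identity system:

```python
import json

# A simplified, spec-style sketch of a DID document, shown as a Python
# dict. The "did:example" method and key values are placeholders drawn
# from the W3C examples, not any production identity system.
did_document = {
    "@context": "https://www.w3.org/ns/did/v1",
    "id": "did:example:123456789abcdefghi",
    "verificationMethod": [{
        "id": "did:example:123456789abcdefghi#keys-1",
        "type": "Ed25519VerificationKey2018",
        "controller": "did:example:123456789abcdefghi",
        "publicKeyBase58": "H3C2AVvLMv6gmMNam3uVAjZpfkcJCwDwnZn6z3wXmqPV",
    }],
}

print(json.dumps(did_document, indent=2))
# Whoever holds the matching private key "is" this identifier. Benefit
# payments, purchase records, and nudges can then all be attached to it.
```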

Behavioral economics is the study of how psychological, cognitive, emotional, social, and cultural factors influence the economic choices a person makes. It challenges the idea of homo economicus, that people maintain stable preferences and consistently make self-interested choices in relation to market forces. The field was popularized in the United States by Nobel Prize-winning psychologist Daniel Kahneman. University of Chicago economist Richard Thaler built upon this work and won the Nobel Prize in Economics for his research last year.

Thaler worked closely with Cass Sunstein, who headed Obama’s Office of Information and Regulatory Affairs. In 2008, they co-wrote Nudge, a book espousing “libertarian paternalism.” People make “choices,” but systems can be designed and implemented to encourage a preferred “choice,” generally one that prioritizes long-term cost-savings. “Choice architects” create these systems and weave them into public policy. Through strategic application of “nudges,” citizens, otherwise “irrational actors” in the market, can be guided to conform to economists’ expectations. Through nudges, human behaviors are redirected to fit mathematical equations and forecasts. David Johnson’s 2016 New Republic article “Twilight of the Nudges” provides useful background on this technique and the ethical implications of applying nudges to public policy.
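A toy example makes the mechanics of choice architecture easier to see. The sketch below uses a hypothetical pension-enrollment scenario with made-up numbers; it shows how the same menu of options produces very different outcomes depending on which default the architect selects:

```python
# A toy model of "choice architecture." The menu never changes; only
# the default does. The pension-enrollment scenario and numbers are
# hypothetical, for illustration only.
def enroll(selection, options, default):
    """Return the person's active choice, or the default on inaction."""
    return selection if selection in options else default

options = ["0%", "3%", "6%"]  # contribution rates offered to everyone

# Most people never actively decide (selection=None), so inertia rules:
print(enroll(None, options, default="0%"))  # opt-in design  -> "0%"
print(enroll(None, options, default="6%"))  # opt-out design -> "6%"

# Same options, same nominal "freedom to choose," yet the architect's
# default quietly decides the outcome for everyone who does nothing.
```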


The first “nudge unit” was established in the United Kingdom in 2010 as the Behavioural Insights Team (BIT). It operated within the Cabinet Office for several years before reinventing itself as a global consultancy in 2014. BIT is now owned in equal parts by its staff, the UK government, and NESTA, a social policy innovation / impact investing foundation funded with proceeds from the UK lottery system. Thaler is on its Academic Advisory Team. From 2015 to 2018 BIT had a $42 million contract with Bloomberg Philanthropies to support development of their “What Works Cities” initiative in the United States. Results for America, the organization that co-hosted the $100 Million “Pay for Success” celebration in Washington, DC last month, currently manages the What Works Cities program on behalf of Bloomberg Philanthropies.

Ideas42 has also been very active at the intersection of social science, behavioral economics, and impact investing strategies. It was founded in 2008 as a program of Harvard University with support from scholars and experts at MIT, Princeton, the International Finance Corporation (IFC), and the Brookings Institution. Focus areas include education, healthcare, and financial inclusion. Numerous mega-philanthropies that are actively implementing the Ed Reform 2.0 agenda have partnered with the organization: Gates, MacArthur, Arnold, Lumina, HP, and Dell. Other partners are involved in deployment of global aid: USAID, the World Bank, the International Rescue Committee (see my previous post re BIT and IRC involvement with Syrian refugee children), and the UN Environment Programme. There are representatives of global finance, including Citi Foundation and American Express; insurance companies, MetLife and the Association of British Insurers; and impact investors focused on health and wellness, the Robert Wood Johnson and Kellogg Foundations.

Over one hundred experts are allied with this program, including Angela Duckworth and Katherine Milkman of the University of Pennsylvania. They created the ninety-second video “Making Behavior Change Stick” as part of their application to the MacArthur Foundation’s $100 Million and Change challenge. While the proposal was not a finalist, Duckworth and Milkman’s research continues to move forward with private support, housed within the Wharton Business School. Their first $1 million came from the Chan Zuckerberg Initiative (founded with Facebook stock), which, interestingly enough, is also currently working with the Philadelphia District Attorney’s office (Larry Krasner) on criminal justice “reform.” More opportunities for our technological overlords to encourage “good” decision making while completely disregarding “broken on purpose” social programs, I suppose.

Take note of the partners identified in Duckworth and Milkman’s MacArthur proposal:

Duckworth and Milkman’s premise is that technology can be used to encourage people to make “good choices,” which begs the question, “Good for whom?” I suspect what will make a certain choice “good” is the likelihood it will enrich social impact investors while furthering the austerity that drives reduction in public services, increases outsourcing, and fosters the creation of public-private partnerships. The desires of those needing to access services will not be factored into the computer code that sets up friction points and establishes preferred outcomes. Citizens are simply inert, raw material to be molded, for profit, by inhumane digital systems. In the nudge model, economic systems that create mass poverty are not addressed. Instead, the impetus is placed upon the individual to better navigate existing systems steeped in structural racism.

As you may remember from my previous post, Duckworth has been working closely with human capital and labor economist James (7-13% ROI on Early Childhood Education Investments) Heckman. She is one of five leaders of the “Identity and Personality” division of his Human Capital and Economic Opportunity Working Group, based out of the University of Chicago and funded by the Institute for New Economic Thinking (INET). In May 2017, Duckworth brought an interdisciplinary group of experts in behavior change to the University of Pennsylvania for a two-day conference sponsored by the Center for the Economics of Human Development. Fourteen presentations, including a “Fireside Chat With Daniel Kahneman,” were recorded and are viewable here.

The prior year, Philadelphia became the first city in the US with its own municipal-level “nudge unit.” Though Duckworth does not appear to be directly involved, Evan Nesterak, a researcher in Duckworth’s Characterlab, co-founded the Philadelphia Behavioral Science Initiative (PBSI) with Swarthmore Professor Syon Bhanot. According to a 2018 report on PBSI published by Results for America, the initiative’s other academic partners include the University of Pennsylvania, Drexel, Temple, St. Joseph’s, Yale, Columbia, and Princeton. The report, viewable here, was funded by the Laura and John Arnold Foundation. John Arnold, a hedge-fund billionaire who made his fortune at Enron, has since moved on to education reform, gutting public pensions, and promoting pay for success “evidence-based” finance.

“Innovative” programs are being incubated within the planning and policy departments of many US cities now via fellowships and loaner “experts” who plan to advance an “evidence-based,” “big-data,” “platform-government” agenda. Anjali Chainani, Mayor Kenney’s Policy Director and Manager of the city’s GovLab, has gone through the Results for America Local Government Fellows program. The Philadelphia Behavioral Science Initiative is an outgrowth of the City Accelerator and GovLabPHL, which she manages. While the initial program areas are strategically uncontroversial (it would be difficult to speak against seniors taking advantage of discounted water bills or public bike sharing), it seems likely an “evidence-based” campaign of nudges, once normalized, will be extended into more lucrative and ethically dubious areas like policing, health care delivery, family services, and behavioral health.

Below is an extensive relationship map that shows interconnections between data-driven public policy / privatization programs originating out of the Harvard Kennedy School of Government, the global financial interests represented by the members of Citi Group’s “Living Cities” program, and how those interface with government operations in the city of Philadelphia. Many of these programs were put into place by our former mayor, Michael Nutter, who went on to become a senior fellow for Bloomberg’s “What Works Cities” program. His wife Lisa is now a principal with Sidecar Social Finance, an impact investing firm.

Click here for the interactive version.

Feeding this machine is our gradual yet irresistible slide into a financial world of digital economic transactions. My next post will focus on that. Please take some time to explore the maps above. They are complex but convey a great deal about the forces at work. Sometimes a nudge is actually a shove. I think our city is being positioned for some serious shoving.

The footage above is from the violent July 5, 2018 police intervention against peaceful OccupyICEPHL protestors at 8th and Cherry Streets outside Philadelphia’s ICE detention center.

-Alison McDowell


What Will Facebook Terragraph, 5G, and Being a “Smart City” Mean for San José Residents?

Reposted with permission from EduResearcher.


A quote from the Smart City Team presentation in April on the Facebook Terragraph (millimeter wave technology) rollout reveals that “deploying at scale in a city has never been done before.” This alone should lead us to ask critical questions about the process and outcomes. To what extent have residents been informed about known risks and hazards of new technologies that they will apparently be subject to, and what kinds of concerns about safety, security, and privacy (or lack thereof) are being contemplated by city leaders as they make final decisions to either fully deploy or hold off on the Smart City experiments? Are cities with tech partnerships exempt from upholding basic standards of protection for human participants in experimental research?

Silicon Valley’s philosophy of “move fast and break things” may not be readily apparent upon landing on the Smart Cities Vision page for San José, but a closer look at key proposals reveals it’s likely in the mix. While it is difficult to know how day-to-day life will change as a result of living in a “smart city,” the issues described below are certainly worth learning more about. What should residents expect as tangible benefits? What will be the costs? What blind spots may exist among well-intentioned leaders making decisions, and will there be unintended (or consciously dismissed) harms resulting from these initiatives?

A precise definition of a “Smart City” remains elusive, yet one thing appears to be at the root: 5G will be involved. A recent Bloomberg update documents tensions between big business and government in the rollout of 5G, with a focus on San Jose’s role in initially participating with, and then protesting, the Broadband Deployment Advisory Committee of the FCC. Despite the recent resistance, it appears that industry dominance is not solely an issue within the FCC but is also influential in shaping local policies for 5G deployment.

…”For San Jose, the march toward 5G continues without the FCC. On Monday, the city struck an agreement with AT&T to install about 200 small-cell devices for 5G on light poles in exchange for $5 million in lease revenue over 15 years. Perhaps the worst part of the whole process, said San Jose Mayor Liccardo, is that most Americans aren’t paying attention: ‘When you’re talking about complex issues of technology and regulation, it’s often lost on the public just how badly they’re being screwed.’”

According to a February 2018 report by Grand View Research, the global smart cities market is anticipated to reach approximately 2.6 trillion dollars by 2025. A summary of the report indicates key industry participants to include tech giants such as Accenture, Cisco Systems, Siemens, IBM, General Electric, and Microsoft.  What appears missing in the summary, however, is the specific situation for San José, where apparently Facebook will also be a main driver and beneficiary of the Smart Cities plan.


See the following overview of the Facebook Terragraph here: “Introducing Facebook’s new terrestrial connectivity systems — Terragraph and Project ARIES” and a video introduction linked to the image above. To read more about Facebook’s partnership with San José, see documents from the April 5th Smart Cities Meeting.

Below is a list of concerns related to the Smart Cities and 5G rollouts. Specific questions are provided at the end of this post. 

A. Public Health Impacts:
1. Scientists and Doctors Demand Moratorium on 5G (original)
(Örebro, Sweden) Sept. 13, 2017 
“Over 180 scientists and doctors from 35 countries sent a declaration to officials of the European Commission today demanding a moratorium on the increase of cell antennas for planned 5G expansion. Concerns over health effects from higher radiation exposure include potential neurological impacts, infertility, and cancer.” Dr. Joel Moskowitz, Director of the Center for Family and Community Health at UC Berkeley, recently announced an additional statement from the International Society of Doctors for the Environment and its member organizations in 27 countries adding to the call for a halt to the rollout of 5G. In the United States, the ISDE member organization is Physicians for Social Responsibility (PSR). There are now over 200 signatories to the original appeal. See the main website here.

2. 5G Wireless Technology: Millimeter Wave Health Effects

3. To learn more about concerns related to 5G and “Internet of Things” technologies, listen to the audio of the following Commonwealth Club discussion held on February 5th, 2018 in San Francisco, CA: ReInventing Wires: The Future of Landlines and Networks, and read the report published by the National Institute for Science, Law & Public Policy here.

4. 5G Wireless Telecommunications Expansion: Public Health and Environmental Implications (in press), Environmental Research. Abridged version available via Bulletin of the Santa Clara County Medical Association, re-shared with permission from author: A 5G Wireless Future: Will It Contribute to a Smart Nation or Contribute To An Unhealthy One?

5. 5G: Great Risk for EU, U.S. and International Health: Compelling Evidence for Eight Distinct Types of Harm Caused by Electromagnetic Field (EMF) Exposures and the Mechanism that Causes Them


B. Big Data Security Issues:

The following screenshot is from the proposed Data Architecture Report with examples of the key platforms being proposed to house data (City of San Jose Open Data Community Architecture Report – 2/2018, p.5)

Below are examples of major security/data breaches from several of the proposed platforms:

C. Privacy Risks:

Smart Cities Come With Inherent Privacy Risks, ACLU Says
Making Smart Decisions About Smart Cities (ACLU of Northern California)
“Smart Cities”, Surveillance, and New Streetlights in San Jose (Feb. 2017, Electronic Frontier Foundation)

Privacy International Reports (with key word search for “Smart Cities”)
Selected Posts/Reports:

The following slide was provided by the Smart Cities Team during the presentation on Privacy at the Smart Cities Committee meeting on April 5th, 2018.

Posted on the slide:
“Many questions remain for us to consider…
* Who owns the data?
* What is our retention policy?
* Where is it housed?
* Who are we sharing the data with?
* Should we have a data monetization strategy?
* How are we managing Big Data?
* Chief Privacy Officer?”

It should be acknowledged that the Smart Cities team has been responsive, offering to meet with community members who have raised concerns. Representatives from the ACLU and NAACP have been invited for individual conversations after making comments at recent public meetings, and the City’s Deputy City Manager, Kip Harkness, has written a blog post on pending projects with key questions at the end related to the need for public involvement.

In the spirit of community engagement, please find the following questions for city leaders and the Santa Clara County Board of Supervisors. (Responses will be posted as soon as they are made available).

1. Does the city have evidence to document the safety of experimental technologies being deployed in light of the biological risks/hazard warnings raised by over 200 scientists who have recommended a halt to the deployment of 5G/millimeter wave technologies? (See here, here, here, here, here, and Section A above for more.)

2. Have alternative solutions to high-speed connectivity been explored (outside of the FB Terragraph/5G/IoT rollouts)? Listen to the audio of the following event from the Commonwealth Club outlining science and policy gaps in addressing these issues: ReInventing Wires: The Future of Landlines and Networks and read the full report here.

3. Will residents be allowed an opportunity to “opt out” of having internet devices connected to the Terragraph by the City or Smart City Technology partners? Or will everyone within city limits be subject to having their information swept up into the data-gathering structures?

4. How will the City of San José justify using platforms for Smart City data architecture that are a) explicitly connected to Amazon commerce sites and b) that have been repeatedly vulnerable to massive data breaches? (see Section B above)

5. What assurances will be provided to ensure data extracted from the Smart Cities program and/or devices connected to the Internet of Things/5G networks will not be used in ways that would harm vulnerable communities? (Click image below for concerns. See also Data Justice Lab and the Data Harms Record).

6. Virginia Eubanks’ recently published book, “Automating Inequality: How High Tech Tools Profile, Police, and Punish The Poor,” documents ways that structural discrimination is being exacerbated by the introduction of new technologies and related policies shaped by algorithmic error and bias. What processes will the Smart Cities team enact to ensure algorithmic transparency, so the public knows how data will be used in analyses and decision-making? Will the City of San José agree to abide by the Principles for Algorithmic Transparency and Accountability published by the Association for Computing Machinery U.S. Public Policy Council?

7. Why is the City of San José partnering with Facebook to deploy untested Terragraph 5G millimeter wavelength technology “at scale throughout the city” given its clear record of betrayal of public trust with privacy violations that allowed data from 87 million users’ profiles to be abused and misused?  

8. During the FB Terragraph presentation at the April 5th meeting, the Facebook representative indicated that specific data would not be extracted, rather that amounts of data sent/received would be monitored via the Terragraph. What evidence can be provided to verify such claims aside from the verbal promises? Have data contracts for the Terragraph project been analyzed and vetted by non-industry-funded privacy/security experts?

9. Will the contracts for the Facebook Terragraph partnership and AT&T 5G small cell rollouts with the City of San José include liability disclaimers similar to these earlier ones from telecom companies? If so, would the City of San José then be held liable in case of harm inflicted on residents as a result of the technologies being deployed (and would the city be able to afford such liability at a large scale)?  Note that similar issues were raised when SB649 was considered at the State level and was eventually vetoed by Governor Brown.

10. With the exception of four city employees working on the Smart Cities project and the Mayor, San José’s Smart Cities Advisory Board consists entirely of individuals from the tech industry without any representation from community based organizations, academics, scientists, public health professionals, independent privacy/security experts, or civil/human rights organizations. How will city leaders be more intentional about structurally integrating community into the process of decision-making related to the Smart Cities Initiatives? 

Smart Cities Advisory Board
Vice President, AERIS
Director and Head of IoT Investment Fund, Cisco Investments
Chief Technology Officer for Smart Cities, Dell EMC
Vice President and General Manager, IoT, Intel
Senior Vice President, ORBCOMM, Inc
Founder and Chief Technology Officer, RevTwo
Former Chief Technology Officer, PayPal
Vice President, CityNOW, Panasonic

For readers interested in more information about Facebook’s reach, see the following maps and analyses by the Share Lab: Research and Data Investigation Lab.

The screenshot above is from the following article via the BBC.
For the main Share Lab site, see https://labs.rs/en/.

_____________________________________________
May 26th Update: Since the original publication of this post, concerns have also been raised about the use of facial recognition technologies throughout the Smart Cities projects (at the May 3rd Smart Cities meeting, where Box software’s facial recognition was proposed in a pilot demonstration, and more recently from ACLU documents that link the use of Amazon’s Rekognition software with Smart City plans in Orlando, Florida). The video below was originally posted to the ACLU YouTube channel with the title Amazon Sells Facial Recognition Tech To Police. More detailed information with concerns about facial recognition technologies can be found here and here. It is currently unclear whether or not the City of San José is using (or plans to use) the Amazon Rekognition facial recognition technology throughout the city. The video below from the ACLU does indicate that the City of Orlando is a “Smart City” that is already using the Rekognition technology.

…”It also already has surveillance cameras all over the city on everything from light posts to police officers. Activating a citywide facial recognition system, could be as easy as flipping a switch. Body cams were designed to keep police officers accountable to the public, but facial recognition turns these devices into surveillance machines. This could mean round-the-clock surveillance whenever cops are present. Imagine what that would mean for minority communities that are already over-policed.”…

The following is a quote from a letter dated May 25th, 2018 from US Congressmen Ellison and Cleaver to Amazon CEO Jeff Bezos:

“According to a page on the Amazon Web Services (AWS) website, Rekognition is a “deep learning-based image recognition service which allows you to search, verify, and organize millions of images.” The same web page describes Rekognition as a tool for performing “real time face searches against collections with tens of millions of faces.” Amazon’s website lists the Washington County Sheriff’s Department and the City of Orlando Police Department as Rekognition customers. A series of studies have shown that face recognition technology is consistently less accurate in identifying the faces of African Americans and women as compared to Caucasians and men. The disproportionally high arrest rates for members of the black community make the use of facial recognition technology by law enforcement problematic, because it could serve to reinforce this trend.”… 

The letter continues with a series of questions requested to be answered by Bezos prior to June 20th, 2018. To learn more about Amazon’s Facial “Rekognition” program, click either of the images below:

For additional reading, see: 
Dr. Beatrice Golomb, Professor of Medicine, UCSD, Letter to Oppose 5G (SB649)
5G Wireless Telecommunications and Expansion: Public Health and Environmental Implications
Why We Should Oppose 5G on Health Grounds // Ronald M. Powell, Ph.D.
Smart Cities, Social Impact Bonds, and the Takeover of Public Education 
The 5G Appeal: Scientists and Doctors Call For a Moratorium On The Roll-Out of 5G
Smart or Stupid? Will the Future of Our Cities Be Easier to Hack? // The Guardian
Philadelphia’s $4,000 Trash Cans A Messy Waste
The Disinformation Campaign and Massive Radiation Increase Behind the 5G Rollout // The Nation [Investigative Report]
Why Smart Cities Need an Urgent Reality Check // The Guardian
The Color of Surveillance in San Diego // San Diego ACLU 
Amazon Pushes Facial Recognition to Police. Critics See Surveillance Risk // New York Times
Together We Can Put A Stop to High-Tech Racial Profiling // ACLU
Amazon Confirms That Echo Device Secretly Shared Users’ Private Audio
Amazon Needs To Come Clean About Racial Bias In Its Algorithms
Emails Show How Amazon Is Selling Facial Recognition System to Law Enforcement. Broad coalition demands that Amazon stop selling dragnet surveillance tool to the government, citing privacy and racial justice concerns

The next San José Smart City meetings will be held from 1:30-4:30pm in the City Hall Chambers on June 7th, 2018.

-Roxana Marachi, Ph.D.

Editor’s Note: Seattle is part of the White House Smart City Initiative. It may not follow the exact blueprint of what’s happening in San Jose, but I think it’s important to start thinking about and asking questions concerning data collection, ownership, and monetization at the city level. Personal privacy is a big concern as well. If every public action can be monitored and recorded, it’s not hard to see smart cities quickly evolving into surveillance cities.

-Carolyn Leith

“Smart and Surveilled:” Building Sanctuary Part 3

Reposted with permission from Wrench in the Gears.


The future is uncertain and unlikely to play out exactly as described. Nevertheless, we must begin to comprehend how technological developments combined with concentrated power and extreme income inequality are leading us to increasingly automated forms of oppression. My hope is that communities will begin to incorporate an understanding of this bigger picture into resistance efforts for public education and beyond. Let us join together, embracing our humanity, to fight the forces that would bring us to “lockdown.” How can we preserve our lives and those of our loved ones outside the data stream? How can we nurture community in a world where alienation is becoming normalized? What do we owe one another? What are we willing to risk? I have divided my story into seven parts. I hope you’ll read along and consider sharing it with others.

This installment highlights smart city surveillance and the Internet of Things. Cam and Li’s lives, including their educational experiences, are shaped by ubiquitous algorithms that align their behaviors to the economic and social expectations put in place by the Solutionists. This is the third installment in the series. If you want to read from the beginning, use this link to access the introduction and Part 1: Plugging In. The whole series can be accessed here: Link

Cam and Li have grown up in a world controlled by sensors and data. All day, every day, sensors watch, track, and transmit information. The devices that make up the vast web of the Internet of Things are tiny, but their combined power is incalculable. The most common IoT sensor in the pre-lockdown years was the smartphone. Practically anyone over the age of ten had one. Acting as sensors, people’s phones were a primary means of data collection, logging information about how people interacted with each other, with systems, and with their physical world.

The first sensors were created to monitor global supply chain shipments. Then, corporate, government and academic researchers devised a dizzying array of sensors to transmit data about most aspects of the physical world and how people live their lives in it. Instead of tracking pallets on cargo ships, they now track people, buses, energy, animals, art, storm water runoff, even sounds and footsteps. Each sensor gathers a particular type of information that can be merged into the data stream for analysis. Predictive analytics algorithms, complex mathematical equations that anticipate future outcomes, tap into the data stream. Such algorithms can be used to predict when the bulb in a streetlight will fail, when a storm sewer will overflow, or even where a crime will happen.
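As a rough illustration of what such an equation looks like, here is a hedged sketch of a logistic risk model applied to the streetlight example. The logistic form is standard in predictive analytics; the weights and telemetry values are invented for illustration, not drawn from any real deployment:

```python
import math

# A hedged sketch of a predictive-maintenance equation. The logistic
# form is standard; the weights and telemetry values are invented for
# illustration, not taken from any real smart-city deployment.
def failure_probability(burn_hours, power_cycles,
                        bias=-6.0, w_hours=0.00012, w_cycles=0.002):
    """Map two sensor readings to a probability that a lamp fails soon."""
    score = bias + w_hours * burn_hours + w_cycles * power_cycles
    return 1.0 / (1.0 + math.exp(-score))  # squash score into (0, 1)

print(f"{failure_probability(10_000, 400):.2f}")    # newer lamp: ~0.02
print(f"{failure_probability(45_000, 2_500):.2f}")  # old lamp:   ~0.99

# Swap the lamp's telemetry for a person's, and the same arithmetic
# produces "risk scores" for crime, truancy, or service use, which is
# the leap this essay is warning about.
```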

For years authorities quietly built datasets that digitally documented community life using police body cameras and later cameras embedded into robot patrols. It showed incredible hubris to roll out such a program under the guise of citizen protection. The cameras, of course, were always looking out at the people, not at the police. Even with footage, police were rarely held accountable for crimes committed. Meanwhile, all aspects of people’s daily lives were taken in; faces, routines, social connections; anything within the field of view of the camera was absorbed by Oracle.

That such data would be turned against citizens in times of civil unrest should have been anticipated. Some who lived in communities that had experienced the evolution of brutal policing were indeed skeptical, but many held on to the idea that the cameras were well intentioned. Cam’s mother vividly remembers the week of the lockdown, how teams were deployed strategically throughout the city in ways that made resistance futile. All those years, the police state’s neural networks had been “learning” their neighborhoods and their faces all in the name of public safety.

Post lockdown, sensors and technology have been integrated into more and more aspects of daily life, pressuring people to make “good decisions.” Strivers feel less and less in control of their daily activities. They await the next haptic pulse that will direct their attention and actions. Cam might crave a pint of chocolate ice cream, but her minder is watching the refrigerator and uses guilt to pressure her into choosing carrots and celery instead. If she doesn’t comply, it will most certainly go into her health data log. Maybe Li wants to sleep late. Well, the sleep monitor strives to keep her on a productive R.E.M. cycle, so it raises the shades in her bedroom and turns on the shower down the hall at the appropriate hour. Is Talia driving to the corner store when she should be walking? Well, her auto tracker knows, as does her step counter, which means her insurance providers know, too. Maybe she can get away with it early in the month, if she has time to make up her activity quota before the 31st. Resources for healthcare are so constrained that people must demonstrate through data that their personal routines and lifestyle choices optimize preventative health protocols.

The Nudge Unit is constantly looking for new ways to incorporate behavioral triggers and feedback loops into online education and VR platforms, too. Buzz, buzz, a text appears. “Cam needs more points on Skyward Skills. It’s time to log on.” Or the pulse monitor indicates Li is too tense. Buzz, buzz, “Take a mindfulness break, kid,” breathe and reflect. Buzz, buzz, “Talia, step away from the screen and walk around the block to avoid blood clots.” Action triggered, data logged, repeat: life has turned into one unending Pavlovian experiment.

Existence has subtly shifted to align to the Solutionist outlook. Economic forecasters rely on people being rational actors as they develop financial projections, and if technology can be used as a tool to shape human behaviors and enforce “rationality,” it is all the better for the global financiers who generate their wealth by speculating on the lives of everyday people. For the strivers, optimization has erased freedom and personal agency.

In the post-labor era, people have become more valuable for the data they produce than for their capacity to do physical work. Thus all but the off-liners have been integrated into the global corporate value chain as commodities. With biometrically-enabled Citi Badges, Cam and Li are not unlike tagged calves or farmed salmon, managed and processed without agency or recourse; lives controlled for the profit of others. The biocapitalist economic model values them only to the extent that they contribute their digital labor to the Solutionists’ data-driven system of outcomes-based results.

Algorithms hold tremendous power over Cam and Li. Using data generated through the Internet of Things, Oracle can make predictions about the type of adults the children are likely to become. What their cost to society will be. What they might contribute as human capital. Should their family fall into poverty, Oracle can evaluate how much profit could be made providing services to “impact” their situation through Pay for Success contracts. Would the predicted rate of return on their lives justify expending the Global Coin required? The Solutionists say, “Just run the data; the data will tell us.”

Talia tries to shelter the family from the data stream as much as possible, but that has proven difficult. Accessing any public services demands data. Walking outside means you are under surveillance. Even at home, devices keep tabs. Data has also become a currency people use to supplement their insufficient Global Coin stipends. The pretense that a person “owns” their own data and can monetize it is supposed to make them feel better about their situation. It doesn’t. Each data transaction puts another piece of one’s soul on the auction block, scrutinized by a predatory system that thrives on want and suffering. And it’s always a buyer’s market. No person in need is going to get ahead selling bits of data. These transactions are just stopgaps until the next Citi Badge stipend hits, a release valve that has thus far kept rebellion at bay.

At first the sensors seemed innocuous, uploading information about when a trashcan was full or telling people where parking spots were available. There were sensors that monitored air quality and ones that made sure streetlights were efficiently managed. People were enthusiastic. But then came the noise sniffers, and the motion sensors, and the drones. Parks and recreation officials were brought on board and encouraged to incorporate cyborg roses into public landscape projects. People had been astonished when Eleni Stavrinidou’s work transforming plants into transistors was first introduced, and now there were rumors of computational forests being grown in remote outposts. Once plants had sensors, people started to get really worried.

Teachers never imagined how sensors would alter classrooms and eventually eliminate them altogether. Adoption of 1:1 devices eroded teacher autonomy until students were spending most of their day with volunteer aides, eyes glued to screens. The teachers that remained were left evaluating student data. In classes where teachers were still allowed to lecture, movement, vibrations and sounds were monitored through sensors embedded in seats. The aim? Supposedly to provide continual feedback regarding student engagement and quality of instruction, but everyone knew it was really to keep track of the content delivered and how students responded. It was chilling.

By that point, the last remaining veteran teachers abandoned the profession. Eventually teacher shortages, austerity budgets, and the corporate education lobby’s campaign for “anytime, anywhere” learning ushered in IoT-enabled learning ecosystems. No one had invested in public education infrastructure for years. Sending everyone home with a device meant there was no longer the expense of feeding poor children. Students too young to stay at home and whose parents were working strivers were packed off to community partners. These partners had been carefully prepared for their role providing standards-aligned summer and out of school time programs. Plus this approach brought education completely under the umbrella of social impact investing, which pleased the financiers. All in all it was a pretty seamless transition. Given how punitive the instruction had become, most felt a sense of relief when the time came to phase out schools entirely.

Ten years out, Cam and Li, like the characters in Isaac Asimov’s short story The Fun They Had, have no idea what “going to school” means. Some nights before turning out the lights, Talia tells the girls stories that give them a glimpse into that past. Yet, it is so far removed from their reality that neither can imagine what it must have been like to learn with a group of other kids. To have a human teacher and books, and go to a school building and spend the day there is a frightening prospect. People live isolated lives. Encounters with others are carefully managed. To spend a full day as part of a group, talking no less, seems a perilous and fraught enterprise.

Now everyone is assigned an Artificial Intelligence (AI) “assistant,” a lifelong learning guide, when they receive their first education voucher. Cam tolerates hers, but Li is another story. They have quite the adversarial relationship. Li accuses her AI of giving her assessments that underestimate her actual ability, so she has to spend days and days going over material she already knows. Her games are always shorting out at a critical moment, right before her points are logged. The algorithm gives her essays failing marks, even though her mom and Grandpa Rex both say she has a gift for creative writing. Cam says that because the companies are rolling out so many new programs, glitches are just going to happen, and not to take them personally. People have always had frustrations with their devices, from autocorrect fails to systems freezing unexpectedly, but now that devices control so much more of people’s lives, their faults are harder to tolerate. Talia often finds herself having to get up from her work and do a hard shutdown of Li’s tablet to give them both a time out.

The AI conversational agents and the platforms that host them employ a variety of tactics to ensure that Cam, Li, and all the children remain on task. Devices record IP addresses and timestamps for logins. Keystroke and facial recognition data are stored, too. Wearables and biometrics are part of the equation. The early headbands and wristbands were incredibly clunky, but five years in they switched to IoT temporary tattoos with sleek designs that prominently identify each child’s designated pathway and rank.

It’s a major milestone when a student attains enough credentials in their portfolio to upload and claim a pathway. The tattoos, not unlike military insignia, help communicate social order and expected etiquette when new people meet. A picture is worth a thousand words, and in a culture that is increasingly non-verbal, a pathway tattoo is an important tool.

To maintain order, the Solutionists knew behavioral engineering had to become central to the educational system. With little meaningful work, systematic mental health training was needed. They wanted people neither too depressed nor too rebellious. Resilience and grit were traits instilled through apps and gamification; children’s mindsets were tracked as closely as the knowledge they acquired. The system was calibrated to identify mental disorders and dissidents early, flagging them for intervention. Both Cam and Li knew kids who had been forcibly plugged into remediation, but it wasn’t discussed openly.

The isolation that resulted from cyber education took a toll on many. Social networks withered. Kids rarely spent time with friends face-to-face. Text-support only went so far in beating back the darkness. Suicide rates climbed, affecting younger and younger children. Programmers scrambled to develop new monitoring procedures. The Global Well Being Program was a leader in the field, its cutting-edge algorithms effective but expensive.

Despite the high cost, sector education officials from all but the poorest communities debited funds for the monitoring service directly from student vouchers. Timely intervention was a matter of life or death, and people were willing to pay. In the post-labor world, monitoring and treating depression was a growth market. Before long, tele-therapy and mental-health VR surged past bio-pharmaceuticals as darlings of the venture capital investment crowd.

By 2025 most major and mid-size cities had become “smart cities,” integrating IoT sensors into a wide variety of infrastructure projects. In doing so, officials created a ubiquitous layer of surveillance across the public sphere. Now, in order to access communal spaces, residents had to acquiesce to being watched. Management of the complex IoT systems required expertise far beyond the in-house capacity of most cities; as a result, outsourcing to global corporations became commonplace.

Over time, voters found they had less and less voice in government. Officials kept up appearances for several election cycles, but it became obvious that technology companies like Sysko were really the ones in charge. People wanted to believe elections still mattered. The history modules made a point of expressing how hard people had fought for the right to vote and to fix problems like gerrymandering, but in the years leading up to lockdown voting became a hollow exercise. Talia had memories as a teen of the media stirring up outrage over voting irregularities. Looking back, they should have realized something was amiss. The solution to this “problem” was to switch to voting on the Blockchain using Citi Badges. Of course that shift effectively shut all of the off-liners, those who had no badge, out of the process.

Democracy was exposed for the charade it had always been, and it became clear to all that they had been living under fascism for a very long time. The cloud-based computing, telecommunication, and global finance interests united under the Solutionist banner and ensured authoritarian control was firmly in place. Global law enforcement working through the Blockchain Collaborative backed the technocrats in their coup. Now for Cam and Li, voting was a topic touched upon briefly in history modules where it was framed as a messy process no longer suited to the well-structured, transparent society the Solutionists had devised.

As the end game neared, secure and exclusive sanctuaries modeled after billionaire and media mogul Richard Braddock’s island home began to appear. He was among the first to bring world thought leaders together to discuss ways to build and scale Blockchain applications. These thought leaders sold everyone a utopian vision of trust, transparency and collective support. Those purported values fell by the wayside, though, shortly after the lockdown.

People with knowledge of edge computing, IoT, and Blockchain deployment, and who had the money, constructed sensor-free zones to which they could retreat. Of course kids like Cam and Li will never be able to obtain access to such sanctuaries. That world is limited to families that can afford the astronomical costs of having human teachers for their children, whose social networks are such that they don’t need citizen scores or e-portfolios to assert their value to society. Sometimes Cam and Li wonder about the sanctuary kids. Surely there aren’t many of them. Are they lonely? Do they feel isolated, too? Are they glad to be unplugged? Do they know about life on the outside, life on the ledger?

Continue to Part 4: Data Mining Life on the Ledger

Supplemental Links

Internet of Things IBM: Link

History of IoT Sensors: Link

What is Blockchain: Link

Supply Chain IoT: Link

Cash VS Digital Economy and Online Payments: Link

Sidewalk Labs: Link

Smart Cities / Noise Sniffer: Link

IoT and Predictive Policing: Link

Police Body Cameras and AI: Link and Link

Patrol Robots: Link

Street Lights and IoT: Link

IoT Parking: Link

Storm water IoT: Link

Smart Trash Cans: Link

Sensors and Smart Cities: Link

Cognitive Drones: Link

Cyborg Roses: Link

Internet of Battlefield Things: Link

Pay for Success and Big Data: Link

Blockchain Social Impact Token: Link

Human Capital Analytics: Link

Nudge Unit: Link and Link

Game Theory, Human Resources and Social Skills: Link

AI Nudge Bots: Link

Behavior Change for Good: Link

Haptic Devices: Link

Rational Choice and Behavioral Economics: Link

Education and Biocapitalism: Link

Behavioral Science and Social Impact: Link

Making Behavior Change Stick: Link

IoT Classrooms: Link

Sensors Determining Education Quality: Link

Affectiva Emotion Sensing Software: Link

Behavioral Biometrics: Link

World Well Being Project: Link

The Fun They Had: Link

Device Use Behavior Tracking in Education: Link

Virtual Agents / USC Institute of Creative Technologies: Link

AI Conversational Agents / Amelia IP Soft: Link and Link

AI Teaching Assistant: Link

Conversational Agents / Articulab: Link

Applied Gaming and Mental Health: Link

Brainwave Data Collection: Link

IoT Tattoos / Duoskin: Link

Pathways to Prosperity / Jobs for the Future: Link

Characterlab / Grit: Link

CASEL / Social Emotional Learning: Link

Serious Games and Mental Health: Link

Government as Platform: Link and Link

IBM Smart Cities: Link

Cisco Smart Cities: Link

New York Smart City: Link

Blockchain Voting: Link

Necker Island Blockchain Summit: Link

Edge Computing: Link

Blockchain Cryptoeconomics: Link

Blockchain Alliance: Link

-Alison McDowell

Data Unicorns? Tech Giants and US Dept of Ed Form Alliance to Leverage Student Data — Without Parent Consent.

Reposted with permission from Missouri Education Watchdog.


Project Unicorn: Billionaire partners promoting data interoperability and online “Personalized Learning”

When the Unicorns “protecting” student data are interoperable with the Unicorns taking it, parents and lawmakers might want to pay attention.

According to Techopedia, in the Information Technology world, “a unicorn is most commonly used to describe a company, for example, a Silicon Valley startup, that started out small but has since increased its market capitalization to, say, $1 billion or more. …For example, the social media giant Facebook, which has a market capitalization of more than $100 billion, is considered as a “super-unicorn among unicorns”. It’s an interesting coincidence, then, that a mega-financed K-12 student data alliance is named after a unicorn.

Meet Project Unicorn.

Project Unicorn’s Mission is to Leverage Student Data and Make Data Interoperable


Project Unicorn’s steering committee is a who’s-who of edtech bundlers, billionaires, and student data power-players. They have formed an “uncommon alliance” committed to leveraging student data by making the data interoperable, flowing seamlessly between all K-12 applications and platforms. While addressing student data security and privacy is a much-needed conversation, it would seem that Project Unicorn has the cart before the horse. There is no talk of student data ownership or consent prior to collecting and using student data. Rather, per this press release, Project Unicorn will continue to take the data, make it interoperable, and talk about it afterwards: “Once interoperability is in place, we can start working with teachers and students to ask questions about the data.” You can see by the tweets below that Project Unicorn initially claimed it wanted to “shift data ownership to the student”; they have since withdrawn that statement. Several schools and districts have been encouraged to join the Project Unicorn Coalition; we wonder if parents in these schools were given an option or are even aware of what this means. We’re going to talk about a few of the Project Unicorn partners and then circle back to their interoperability goals and how that fits with student data ownership, ethics, and the newly formed and related Truth About Tech and Humanetech.
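For readers who want to know what “interoperability” looks like mechanically, below is a hedged sketch of the kind of field mapping involved: a record exported from one vendor’s schema is translated onto a shared one so any partner application can consume it. The field names are invented for illustration and are not Project Unicorn’s actual specification:

```python
# A hedged sketch of schema mapping, the mechanical core of data
# interoperability. All field names are invented for illustration;
# they are not Project Unicorn's or any vendor's actual specification.
VENDOR_TO_COMMON = {
    "studentId": "student_identifier",
    "dob": "birth_date",
    "readingLvl": "reading_level",
    "minutesOnTask": "engagement_minutes",
}

def to_common_schema(vendor_record):
    """Rename each vendor field to its shared-schema equivalent."""
    return {VENDOR_TO_COMMON.get(k, k): v for k, v in vendor_record.items()}

record = {"studentId": "A-1001", "dob": "2009-04-02",
          "readingLvl": 3.2, "minutesOnTask": 47}
print(to_common_schema(record))

# Note what the pipeline never asks: whether a parent consented before
# the record flows on to the next platform.
```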

A few points before we start:

  • When it comes to “free” edtech products, you know that if it is free, you are the product; you pay with your data and your privacy. With edtech and 1:1 devices, personalized learning, online assessments, online homework, and LMS systems, students usually do not have a choice. Students do not have the ability to consent or opt out. Why?
  • Not all philanthropy is charity. As this article points out, for some, philanthropy is an investment, these nonprofits may “look” charitable but they are truly meant to make money and to buy power and influence policy, and sometimes do harm.
  • McKinsey Global Institute estimated that increasing the use of student data in education could unlock between $900 billion and $1.2 trillion in global economic value.
  • Children are not data points to predict, standardize, and analyze. Currently, online platforms can collect every keystroke and analyze and predict children’s behaviors. Children are not meant to be experimented on, and #KidsAreNotInteroperable.
  • Currently, students’ data can be shared, researched, analyzed, and marketed without parental consent. Often, parents cannot refuse the data sharing and cannot see the data points shared or how they are analyzed.
  • Edtech and Silicon Valley companies can gain access to personal student information without parent consent, under the School Official exception in FERPA. The US Department of Education not only promotes edtech companies, it tells tech companies HOW to gain access to student data, and is partnered in this project to make data sharing interoperable.
  • Interoperable data systems will allow even larger, more predictive data profiles of children: everything they do and are. The best way to protect privacy is to not collect data in the first place. Interoperability, with bigger, more detailed, and more sensitive data sets being shared and mixed with third parties, is risky for both privacy and security. The US Department of Education has already warned of cyber hackers ransoming sensitive data from schools; who will be responsible and liable for more data breaches?

Back to unicorns.

How is the US Department of Education involved with Project Unicorn? 

The USDoE (your tax dollars) has been a major driving force of funding and support for online education and data interoperability. Data interoperability requires common data standards. CEDS (Common Education Data Standards) are codes used to tag student data; you can see the more than 1,700 data codes, or elements, in the federal student data dictionary. These common data tags were created with the help of Bill Gates, funder of the Data Quality Campaign; read about the mission of DQC at the US Department of Education Summit here. Data Quality Campaign also provides policy guidance to legislators and education agencies, such as this 2018 DQC Roadmap promoting cross-agency data sharing. With the shift in education focusing more on workforce talent pipelines (see both ESSA and WIOA), the Workforce Data Quality Campaign (funded by the Gates, Lumina, Arnold, and Joyce Foundations) has also influenced the US Department of Labor. The US Department of Labor’s Workforce Data Quality Initiative plans to use personal information from each student, starting in pre-school, via the states’ SLDS data systems. You can read more about the SLDS, the roles that the US Department of Education and Bill Gates play in student data collection, and the weakening of federal privacy law FERPA here. In recent years Microsoft’s commitment to data privacy has been called into question, as per this EdWeek article. Even Microsoft itself admits it can take a peek at student data, trend through it, and put it on the market.

“If students are using certain cloud infrastructures, and it’s held by a third party, it is possible for [the vendors] to trend through the data,” said Allyson Knox, director of education policy and programs for Microsoft. “When [information] is flowing through a data center, it’s possible to take a peek at it and find trends and put it on the market to other businesses who want to advertise to those students.”

Knox said Microsoft has a “remote data center” where student information is housed but that “students’ data belongs to them.” -Microsoft https://www.fedscoop.com/lawmakers-hear-testimony-on-student-data-and-privacy/                     

Does Microsoft still believe that student data belongs to the student?

[Video: Gates: In 5 Years]

Microsoft, Bill and Melinda Gates Foundation

The Bill and Melinda Gates Foundation is a nonprofit whose IRS 990 forms can be seen here and (2016) here, and the Foundation Trust’s here; their awarded grants can be seen in this searchable database. Gates spends billions on K-12 and higher ed reform. Gates (and the Data Quality Campaign) both support a national student database, and now Gates is shifting his multi-billion-dollar focus from Common Core to K-12 networks and curriculum.

(See With new focus on curriculum, Gates Foundation wades into tricky territory.)

Microsoft is desperately hoping to regain ground in the K-12 classroom 1:1 device market: with management systems, cloud services, and the gamification of education (yes, Microsoft owns Minecraft and is promoting Minecraft in classrooms); with K-12 LinkedIn data badges (yes, Microsoft owns LinkedIn, and yes, there are LinkedIn K-12 badge pilots in AZ and CO); by introducing chatbots and Artificial Intelligence into education; and with several online tools like Microsoft OneNote, favorably reviewed here by their unicorn partner Digital Promise. Microsoft is also part of the US Department of Education’s push for online curriculum via Open Educational Resources (OERs). Microsoft will be handling and indexing the content for the Federal Learning Registry. (You can read more about how the federal Department of Defense and Department of Education are involved in OERs here.)

According to this December 2017 New York Times piece, Microsoft is fiercely trying to regain ground in the K-12 classroom market.

Tech companies are fiercely competing for business in primary and secondary schools in the United States, a technology market expected to reach $21 billion by 2020, according to estimates from Ibis Capital, a technology investment firm, and EdtechXGlobal, a conference company.

It is a matter of some urgency for Microsoft. 

Chromebooks accounted for 58 percent of the 12.6 million mobile devices shipped to primary and secondary schools in the United States last year, compared with less than 1 percent in 2012, according to Futuresource Consulting, a research company. By contrast, Windows laptops and tablets made up 21.6 percent of the mobile-device shipments to schools in the United States last year, down from about 43 percent in 2012. – https://www.nytimes.com/2017/05/02/technology/microsoft-google-educational-sales.html [Emphasis added]

Digital Promise

If you aren’t familiar with Digital Promise, it is a non-profit created by the US Department of Education to PROMOTE edtech in the classroom. Read about Digital Promise and Global Digital Promise here. Digital Promise is demanding data interoperability for school districts. Digital Promise presented their report The Goals and Roles of Federal Funding for EdTech Research at this 2017 symposium, which was funded by tech foundations and corporations such as Bill and Melinda Gates, the Chan Zuckerberg Initiative, Strada, Pearson, Carnegie… you get the idea. In their report, Digital Promise acknowledges that the federal government has spent significant money on developing and disseminating technology-based products in the classroom with little to no information on how these products are working. So, is the answer to rely on tech-financed entities and unicorns to review and research the efficacy of future edtech products? No conflict of interest there. Digital Promise also utilizes the heavily Gates-funded and controversial Relay Graduate School, which you can read about here.

The algorithm-driven Personalized Learning model does not work.

Digital Promise and others in edtech continue to push for online Personalized Learning despite many warnings from edtech insiders, including this piece from Paul Emerich, entitled Why I Left Silicon Valley, EdTech, and “Personalized” Learning. Emerich’s concerns with algorithm-driven Personalized Learning are summed up in this quote:

“It was isolating with every child working on something different; it was impersonal with kids learning basic math skills from Khan Academy; it was disembodied and disconnected, with a computer constantly being a mediator between my students and me.”

And in this piece by Rick Hess, A Confession and a Question on Personalized Learning, the CEO of Amplify admits Personalized Learning is a failure. We wish every policy wonk and educrat would read this:

…“Until a few years ago, I was a great believer in what might be called the “engineering” model of personalized learning, which is still what most people mean by personalized learning. The model works as follows:

You start with a map of all the things that kids need to learn.

Then you measure the kids so that you can place each kid on the map in just the spot where they know everything behind them, and in front of them is what they should learn next.

Then you assemble a vast library of learning objects and ask an algorithm to sort through it to find the optimal learning object for each kid at that particular moment.

Then you make each kid use the learning object.

Then you measure the kids again. If they have learned what you wanted them to learn, you move them to the next place on the map. If they didn’t learn it, you try something simpler.

If the map, the assessments, and the library were used by millions of kids, then the algorithms would get smarter and smarter, and make better, more personalized choices about which things to put in front of which kids.

I spent a decade believing in this model—the map, the measure, and the library, all powered by big data algorithms.

Here’s the problem: The map doesn’t exist, the measurement is impossible, and we have, collectively, built only 5% of the library.

To be more precise: The map exists for early reading and the quantitative parts of K-8 mathematics, and much promising work on personalized learning has been done in these areas; but the map doesn’t exist for reading comprehension, or writing, or for the more complex areas of mathematical reasoning, or for any area of science or social studies. We aren’t sure whether you should learn about proteins then genes then traits—or traits, then genes, then proteins.

We also don’t have the assessments to place kids with any precision on the map. The existing measures are not high enough resolution to detect the thing that a kid should learn tomorrow. Our current precision would be like Google Maps trying to steer you home tonight using a GPS system that knows only that your location correlates highly with either Maryland or Virginia.

We also don’t have the library of learning objects for the kinds of difficulties that kids often encounter. Most of the available learning objects are in books that only work if you have read the previous page. And they aren’t indexed in ways that algorithms understand.

Finally, as if it were not enough of a problem that this is a system whose parts don’t exist, there’s a more fundamental breakdown: Just because the algorithms want a kid to learn the next thing doesn’t mean that a real kid actually wants to learn that thing.

So we need to move beyond this engineering model…” — Larry Berger, CEO of Amplify, excerpt Rick Hess Straight Up Blog [Emphasis added]
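Berger’s “engineering model” is, at bottom, a measure-assign-remeasure loop. Here is a minimal sketch of the loop he describes, in hypothetical Python; the map, threshold, and library contents are invented for illustration, and this is not Amplify’s actual system.

    # Sketch of the "engineering model": map -> measure -> library -> assign.
    # All names, thresholds, and content below are hypothetical.

    SKILL_MAP = ["counting", "addition", "subtraction", "multiplication"]

    LIBRARY = {
        "addition": ["video_A", "worksheet_B", "game_C"],
        # ...in theory, one shelf of "learning objects" per skill on the map
    }

    def place_on_map(scores):
        """Return the first skill the student has not yet mastered."""
        for skill in SKILL_MAP:
            if scores.get(skill, 0.0) < 0.8:  # arbitrary mastery threshold
                return skill
        return None  # student has finished the map

    def next_learning_object(scores):
        """Pick the 'optimal' object for the student's spot on the map."""
        skill = place_on_map(scores)
        if skill is None:
            return None
        shelf = LIBRARY.get(skill, [])
        return shelf[0] if shelf else None  # trivially "optimal" here

    # Measure, assign, then measure again and repeat.
    print(next_learning_object({"counting": 0.9, "addition": 0.4}))  # video_A

Berger’s point is that every box in this loop is broken in practice: the map exists only for early reading and the quantitative parts of K-8 math, the assessments are too coarse to place a student on it, and only about 5% of the library has ever been built.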

 

And… Digital Promise just published a 2018 report promoting “Personalized Learning,” co-authored by Tom Vander Ark, here. In this report you can find such gems as this global mantra (embraced in the US as well): learning and teaching knowledge is no longer the main goal of education; it is more important to gather data about how students think and feel.

“According to the World Economic Forum, the top five most valued skills for workers in 2020 are: 1) complex problem solving; 2) critical thinking; 3) creativity; 4) people management; and 5) coordinating with others. This is a far cry from simply needing a grasp of reading, writing, and arithmetic to be marketable to employers. While mastery of the three Rs remains critical, it is merely the launching point and no longer the end goal. We need to re-think the education system.” –US Department of Education’s Digital Promise http://digitalpromise.org/wp-content/uploads/2018/01/lps-policies_practices-r3.pdf

Getting Smart, Tom Vander Ark

Tom Vander Ark is the author and creator of Getting Smart, and is the “director of 4.0 Schools, Charter Board Partners, Digital Learning Institute, eduInnovation, and Imagination Foundation, and advises numerous nonprofits.” Vander Ark is also the former Executive Director of Education for the Bill & Melinda Gates Foundation. Vander Ark, in this 2011 video, said that Common Core’s mandate of online assessments could be used as a lever to get computers into the classroom, computers for personalized learning to help replace teachers. Tom Vander Ark also said gone are the “days of data poverty” once we use online formative tests rather than end-of-year high-stakes tests. Vander Ark is also featured in this Global Education Futures conference; notice that Vander Ark is speaking on how to Unbundle Billions in Education.

Dell Foundation.

What could Dell computers possibly have to do with tech in schools and student data, you ask? For starters, Dell funds some heavy hitters in data analytics, such as McKinsey and Boston Consulting Group. Dell also has a “free” app for high school students called Scholar Snap, which handles students’ personal scholarship data. Interestingly, Scholar Snap is also partnered with the Common App, both of which are third-party vendors within Naviance, a K-12 workforce data platform. (You can read about Naviance and their data mining, including how Common App asks students to waive their FERPA rights, here.) Additionally, Dell (along with Gates) helps fund CoSN, the makers of the (industry self-policing, self-awarding) Trusted Learning Environment Seal for Student Data. CoSN also promotes data collection and personalized learning. Their “data driven decision making mission” is to “help schools and districts move beyond data collection to use data to inform instructional practice and personalize learning.” Not surprisingly, CoSN is also co-author of this Horizon Report, touting the virtues of Virtual Reality (VR), robotics, and wearable tech, expected to be adopted in K-12 education within the next 3 to 5 years.

“The wearable format enables the convenient integration of tools into users’ everyday lives, allowing seamless tracking of personal data such as sleep, movement, location, and social media interactions. Head-mounted wearable displays such as Oculus Rift and Google Cardboard facilitate immersive virtual reality experiences. Well-positioned to advance the quantified self movement, today’s wearables not only track where people go, what they do, and how much time they spend doing it, but now what their aspirations are and when those can be accomplished.” –CoSN Horizon Report 2018

Side note: It’s not just students who will be required to track and share their biometric and personal data. As this New York Times piece reports, teachers in West Virginia were required to submit their personal information to a health tracking app or risk a $500 penalty.

They implemented Go365, which is an app that I’m supposed to download on my phone, to track my steps, to earn points through this app. If I don’t earn enough points, and if I choose not to use the app, then I’m penalized $500 at the end of the year. People felt that was very invasive, to have to download that app and to be forced into turning over sensitive information.

The Future of Privacy Forum

The Future of Privacy Forum is a Project Unicorn partner and a DC think tank funded by many tech foundations and corporations, including but not limited to: Amazon, Apple, AT&T, Comcast, Facebook, Google, Microsoft, Verizon, Samsung, Sidewalk Labs (Google’s Alphabet, Smart Cities), Walt Disney, the Bill & Melinda Gates Foundation, and the National Science Foundation; Hobsons (Naviance), Intel, Palantir, Pearson, Netflix, and Mozilla name only a few of their other big-name supporters. Their K-12 arm focuses on balancing student data privacy with supporting innovation and technology in the classroom.

New technologies are allowing information to flow within schools and beyond, enabling new learning environments and providing new tools to improve the way teachers teach and the way students learn. Data-driven innovations are bringing advances in teaching and learning but are accompanied by concerns about how education data, particularly student-generated data, are being collected and used.

The Future of Privacy Forum believes that there are critical improvements to learning that are enabled by data and technology, and that the use of data and technology is not antithetical to protecting student privacy. In order to facilitate this balance, FPF equips and connects advocates, industry, policymakers, and practitioners with substantive practices, policies, and other solutions to address education privacy challenges.

While it is fantastic to have such a well-funded group concerned about student privacy, we wish they would go further. The Future of Privacy Forum doesn’t advocate for student and parent consent before taking or using student data, nor do they say students should own their own data. We wish they advocated for the right of parents to be ensured paper-and-pencil, book, and face-to-face human teacher alternatives to online curriculum. We also wish that the Future of Privacy Forum would better highlight that predictive algorithms are not regulated or transparent, and that metadata and personalized, adaptive learning are exempted from state privacy laws, often with this or very similar language:

“Nothing in this section…”

And though the Future of Privacy Forum does promote technology in the classroom, screen addiction is a concern for parents. (Although tech addiction has seen increased media coverage as of late, it’s not new; see this 2015 New York Times article on the toll that screen addiction takes on children. Surprisingly, however, some would still argue that tech is not addictive.) When promoting technology in the classroom, the Future of Privacy Forum could do a better job addressing the many well-documented health risks of screen use, including behavioral changes, links to teen depression and suicide, sleep disturbance, and damage to retinas and vision loss, and could better highlight guidance from the American Academy of Pediatrics warning that wireless devices and cell phones can cause cancer.

Common Sense Media

Common Sense Media is a nonprofit that is supported by several foundations, including but not limited to: the Bezos (Amazon) Family Foundation, the Bill and Melinda Gates Foundation, the William and Flora Hewlett Foundation, Carnegie Corporation of NY, the Eli and Edythe Broad Foundation, the Michael & Susan Dell Foundation, the Overdeck Family Foundation, the R.K. Mellon Foundation, Symantec, the Anschutz Foundation, and the Annie E. Casey Foundation. Another of their investors states that “Common Sense Media provides unbiased and trustworthy information about media and entertainment that helps parents and children make informed choices about the content they consume.”

Can Project Unicorn or any of its partners truly claim to be unbiased, since they are funded by the data-driven tech industry? Since they are in a position to inform and advise on education policy, this is an important question.

Common Sense Media, even after hosting an event about tech addiction (see Truth About Tech below), is still advocating that only certain screen time exposure is addictive or concerning. Common Sense says that when it comes to screen time, “there really is no magic number that’s ‘just right.’” Parents would argue that while content is certainly important, addiction, retinal damage, cancer risk, permissionless data collection, and online safety risks apply to both educational and non-educational screen time, and affect children regardless of digital content.

[Screenshot: Common Sense Media tweet]

To their credit, Common Sense Kids Action recently hosted a full-day conference (video) on “Truth About Tech: How tech has our kids hooked.” It is great to get this conversation into the spotlight (you can see the agenda here), but there was no mention of giving students and parents ownership and control of how student data is collected, analyzed, and shared. With online personalized learning and 1:1 devices being pushed at students as early as kindergarten and preschool, and no laws regulating metadata, data analytics, or hidden algorithms, limiting screen time in schools and consent for data collection should have been discussed. Instead, Common Sense, along with Project Unicorn, is focused on data interoperability to keep the K-12 data flowing, and will continue to ask parents to better control children’s screen time use at home.

[Video: Common Sense “Truth About Tech” conference]

The last segment of Common Sense’s Truth About Tech event, entitled “Solutions for Families, Schools, and Democracy,” was moderated by Rebecca Randall, Vice President of Education Programs at Common Sense, with guest speakers and Common Sense partners Dr. Carrie James, research associate at Project Zero, Harvard Graduate School of Education, and Randima Fernando of the Center for Humane Technology. This entire piece is worth your time; Mr. Fernando had some excellent points on gaming and technology. However, we are going to focus on Dr. James’ comments since, as Ms. Randall mentions, it is on Dr. James’ work regarding digital ethics that Common Sense bases their K-12 digital literacy and citizenship curriculum. Common Sense Media is about to begin working again with Dr. James and Harvard’s Project Zero to develop updated K-12 digital guidance.

At the 49-minute mark, answering a question about parents as role models, Dr. James remarks:

“We have a growing pile of evidence to suggest that parents are not doing a great job in this regard. In recent research that we’re doing with Common Sense, we’ve reached out to schools and teachers across the country and in a couple of countries around the world and asked, you know, what are some of the most memorable digital challenges your schools have faced, and a surprising number of them have to do with parents.”

With screens being so addictive, we agree that many parents, and most of society, undoubtedly could be better screen time role models; however, we disagree with Common Sense’s continued emphasis only on non-educational screen use. We hope that Common Sense and their partners at Harvard Project Zero, who will be working on the new digital literacy and citizenship curriculum, will consider age-appropriate screen use, health and safety guidelines, parental consent, and data ownership for children using devices and screens for educational purposes, including online homework. Parents send their children to school expecting them to be safe. Many parents do not want their children required to use screens and technology for regular coursework and when learning core subjects. Many parents are uncomfortable with online personalized learning and would prefer the option of face-to-face human teachers and textbooks. The cost of attending public school should not be mandatory screen exposure and loss of privacy. We hope that Common Sense will address these concerns in their work.

Project Unicorn is Promoting Interoperability. What is it?

An April 2017 Clayton Christensen Institute blog post, shared on the Project Unicorn news website, explains the path to data interoperability this way:

“The first path toward interoperability evolves when industry leaders meet to agree on standards for new technologies. With standards, software providers electively conform to a set of rules for cataloging and sharing data. The problem with this approach in the current education landscape is that software vendors don’t have incentives to conform to standards. Their goal is to optimize the content and usability of their own software and serve as a one-stop shop for student data, not to constrain their software architecture so that their data is more useful to third parties.

Until schools and teachers prioritize interoperability over other features in their software purchasing decisions, standards will continue to fall by the wayside with technology developers. Efforts led by the Ed-Fi Alliance, the Access for Learning Community, and the federal government’s Common Education Data Standards program, all aim to promote common sets of data standards. In parallel with their (sic) these efforts, promising initiatives like the Project Unicorn pledge encourage school systems to increase demand for interoperability.”  [Emphasis added] https://www.christenseninstitute.org/blog/making-student-data-usable-innovation-theory-tells-us-interoperability/

A one-stop shop for student data, flowing seamlessly for third parties: Interoperability. 
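In plain terms, interoperability is a translation layer: every vendor’s records get mapped onto one shared schema that any third party can consume. Here is a minimal, hypothetical sketch; the vendor names and field names are invented for illustration and do not come from any actual standard.

    # Hypothetical sketch of interoperability: two vendors' records,
    # one common schema. All vendor names and field names are invented.

    def to_common_schema(vendor, record):
        """Map a vendor-specific student record onto a shared format."""
        if vendor == "math_app":
            return {"student_id": record["uid"],
                    "skill": record["topic"],
                    "score": record["pct"] / 100.0}
        if vendor == "reading_app":
            return {"student_id": record["learner"],
                    "skill": record["strand"],
                    "score": record["mastery"]}
        raise ValueError("unknown vendor: " + vendor)

    # Once every product emits the same schema, merging streams is trivial,
    # which is precisely what makes the data so valuable to third parties.
    events = [
        to_common_schema("math_app", {"uid": "S123", "topic": "fractions", "pct": 72}),
        to_common_schema("reading_app", {"learner": "S123", "strand": "inference", "mastery": 0.4}),
    ]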

How will  Project Unicorn help give students ownership of their data? Will students have consent and control over their data? We asked. 

Interestingly, up until a few days ago, Project Unicorn’s twitter profile stated that their focus is “shifting the ownership of data to schools and students.” See this screenshot from February 18, 2018 and a twitter conversation below.

[Screenshot: Project Unicorn tweet, February 18, 2018]

Project Unicorn replied the following day, but they did not immediately answer my question about student data consent and ownership. Instead, they listed a few of their partners: Data Quality Campaign, Future of Privacy Forum, Common Sense Media, National PTA. Again, I asked them about their statement about shifting ownership of data to the student.

[Screenshots: Project Unicorn twitter replies]

Gretchen Logue also replied to Project Unicorn and their partners, asking if students can NOT have their data shared. Two days later, she still had not received a reply.

[Screenshot: Gretchen Logue tweet]

I directly asked Project Unicorn’s partner Digital Promise to help answer whether students can consent to data collection. (Remember, Digital Promise is the edtech- and personalized-learning-promoting non-profit created by the US Department of Ed.) Digital Promise never responded to this parent’s questions. Maybe they just need a little more time, or maybe parents aren’t important enough to bother with?

[Screenshots: Twitter exchange with Digital Promise]

Project Unicorn replied: they changed their twitter profile to better reflect the scope of their project. They no longer claim to shift data ownership to students. They are promoting data interoperability. To be clear: they are NOT giving students ownership of their data. See their new twitter profile in the February 23, 2018 screenshot below.

[Screenshot: Project Unicorn’s new twitter profile, February 23, 2018]

Why do edtech companies and our government have such a problem giving students consent and true ownership of their data? Data is money. Data is identity. Student data is NOT theirs to take.

Without the student, the data does not exist. If a student writes an essay for a class assignment, that written work belongs to the student. If a student draws a picture in art class, that artwork is theirs. Parents (and the Fourth Amendment) would argue that personal information about a student, created by a student, should belong to the student.

#TruthinTech: Unicorns are taking student data and sharing it without consent. What say you @HumaneTech?

[Screenshot: #TruthinTech tweet to @HumaneTech]

Tech is hacking kids’ brains, but it is also stealing their data: students’ every keystroke can be collected and analyzed, and student education records can be shared. (FERPA is a 40-year-old law that doesn’t cover data, metadata, or algorithms, and it was substantially weakened in 2011 to allow personally identifiable information to be shared outside of the school, without parent consent or knowledge, with nonprofits, researchers, and anyone approved as a school official or for an educational purpose.) HumaneTech folks, are you good with this predictive profiling, leveraging, and capitalizing of children who are held hostage in this mandatory, surveilled school system? Schools are the new smart cities, except children are a captive audience and they are being exploited. They have no choice.

Why not do real, independent research, set guidelines, and protect kids from screens in schools? Why not give parents and students a choice of tech vs. paper, and allow the option of learning knowledge vs. in-school personality surveys, emotional assessments, and biometric health trackers? Why not be transparent about algorithms and analytics, and get consent BEFORE collecting and using student or teacher data?

GDPR.

Europe requires consent before collecting and sharing personal data, including data used for automated decision making. GDPR gives Europeans (including students) more control over how their data is handled, including breach notification and penalties, data redaction, and consent. Why would American students be any less deserving than students in Europe? GDPR will have global implications. Modernizing FERPA and COPPA to align with GDPR would be both practical and ethical. Why isn’t Project Unicorn also advocating for the GDPR standard of basic human privacy and data identity rights for American citizens and children?
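For contrast, here is a minimal sketch of what GDPR-style, consent-first collection could look like in an edtech product. This is hypothetical code, not any vendor’s actual implementation, and the function names are invented.

    # Hypothetical sketch: GDPR-style, consent-first event logging.
    # Nothing is collected unless explicit, revocable consent is on file.

    consent_registry = {}  # student_id -> True/False, set by parent or student

    def record_consent(student_id, granted):
        consent_registry[student_id] = granted

    def log_event(student_id, event):
        if not consent_registry.get(student_id, False):
            return None  # default is no collection, not opt-out-after-the-fact
        return dict(event, student_id=student_id)

    record_consent("S123", False)
    assert log_event("S123", {"action": "clicked"}) is None  # nothing leaves

The design choice GDPR encodes is that the default answer is no: collection happens only after consent, and consent can be withdrawn.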

A final question. Project Unicorn is not an elected governing body; are they directing US education policy? Decisions should be made democratically, by those closest to the children, instead of by a few billionaires. What gives philanthro-funders the right to leverage children’s data and encourage schools with their procurement $trategies? The edtech billionaires directing education, and experimenting on children, have created (and are profiting from) this data-driven problem: teachers are so busy collecting endless data points that they don’t have the time or the freedom to teach. Now the regretful tech industry wants to swoop in and make the data collection process easier, and free up teachers (or replace them?), with a single-sign-on, standardized data collection tool. Children are not a product to be leveraged. Please stop using schools and children as a permissionless-innovation data supply.

IMS Global

And why, oh why, Project Unicorn, are you working with IMS Global? Uncommon Alliance indeed.

“…interoperability specification for educational click stream analytics created by the education community for the education community. Major educational suppliers are using Caliper to collect millions of events every week and the data is helping to shape teaching and learning on multiple levels. Several leading institutions are also working on putting Caliper in place. Now is a great time for both institutions and suppliers to begin putting learning analytics in place using Caliper.”

IMS Global Learning Consortium
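To make “click stream analytics” concrete: Caliper defines a common format for the events an app emits each time a student acts. Below is a simplified sketch of one such event, modeled loosely on IMS Global’s published Caliper examples; the identifiers and values are invented, so consult the actual specification for the real structure.

    # Simplified, hypothetical click-stream event in the spirit of IMS Caliper.
    # Identifiers and values are invented; see IMS Global's specification
    # for the real event structure.
    event = {
        "type": "NavigationEvent",
        "actor": {"id": "https://example.edu/users/S123", "type": "Person"},
        "action": "NavigatedTo",
        "object": {"id": "https://example.edu/reading/page/7", "type": "WebPage"},
        "eventTime": "2018-02-23T14:05:00Z",
    }
    # "Millions of events every week," each one timestamped and tied to a
    # student identifier, add up to exactly the profile parents worry about.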

-Cheri Kiesecker