From Math to Marksmanship: Military Ties to Gamified Assessments

Reposted with permission from Wrench in the Gears.


There is a difference between education and training. There is a difference between knowing just enough to carry out orders without questioning the chain of command and knowing enough to participate in civic life as a critical thinker. If educational technology is an extension of military training and human engineering, which it is, we should give careful consideration to what our society needs at this time, and to whether we should be allowing the military-industrial complex to data-mine and track our children’s innermost thoughts.

This past February, economist James Heckman convened a working group of social scientists to discuss new types of assessments that are being designed to capture data about children’s social-emotional traits and predict future behaviors. The researchers spent two days in an oak-paneled room at the University of Chicago, where they collaborated on the new assessments and measurements. Impact investors, like Heckman’s patron JB Pritzker, need the metrics these tests will deliver to fuel their predatory, speculative pay-for-success schemes. Videos of the recorded presentations can be viewed here.

I will be excerpting segments of these talks on my blog, since I know most of you won’t have the time to sit through hours of viewing. This first segment highlights the intersection of educational technology and military training. For more information read one of my early pieces “How exactly did the Department of Defense end up in my child’s classroom?”

It is important to note that ReadyNation, sponsor of the Global Business Summit on Early Childhood, is a program of the Council for a Strong America. ReadyNation is their workforce development program. Another of the group’s five program areas is “Mission Readiness.” The website states this initiative is run by seven hundred “Retired admirals and generals strengthening national security by ensuring kids stay in school, stay fit, and stay out of trouble.”

There is a difference between education and training. There is a difference between knowing just enough to carry out orders without questioning the chain of command and knowing enough to participate in civic life as a critical thinker. If educational technology is an extension of military training and human engineering, which it is, we should give careful consideration to what our society needs at this time, and to whether we should be allowing the military-industrial complex to data-mine and track our children’s innermost thoughts.

Watch the clip here. Full talk here.

Timestamp 6 minutes 40 seconds

Jeremy Roberts (PBS Kids): I’ll hand it over to Greg. I wanted to give you a chance to talk about UCLA CRESST.

Gregory Chung (UCLA, CRESST): So, just quickly, you know what we bring to the project is expertise in the use of technology for measurement purposes. Whether it’s simulation or games. How do we turn that information about what we think is going on in their heads to their interaction with the game? So going through that whole analysis process from construct definition to behavior formation. And then just a general, we do research in a military context and in an education context, training, pre-k to adults. I joke that my motto is from math to marksmanship. (audience laughter)

Unidentified Audience Member: Can you say what the relationship is between the military and education?

Chung: Ah, it’s like…it is like… at a certain level they’re the same. Military training is about effectiveness. You train just enough to get someone to do some job. But integrated technology, adaptive systems give feedback. So all the instructional issues that you commonly apply to education, you apply to the military. But also you go from the military, who kind of created the whole instructional design system, back to education. And it’s really interesting when we have an intersection in say marksmanship, how do we measure skills (pantomimes shooting a rifle) with sensors, but then we bring in the educational assessment framework, like what’s going on in here (points to his head/brain), how that transfers to wobble and shake (points to torso).

Roberts: If the armed forces were to find out that, say, the students were not scoring sufficiently on the ASVAB to make them confident that they’d be able to operate the next generation of tank, for example, the army might be really interested in early childhood education.

Chung: (chuckling in audience) So, really they’re the same.

Heckman: It has, right? Already. And quite a few aren’t able to pass the ASVAB.

-Alison McDowell


How Big Data Becomes Psy Ops and Tilts the World Towards its Own Aims: Next Stop, Public Education

Reposted with permission from Educationalchemy.

Ludovico technique apparatus – A Clockwork Orange

While “grit” has been exposed for the racist narrative it is, it’s also a direct by-product of the same OCEAN framework used to control, predict, and manipulate voters. If this data can sway major national elections and change the global trajectory of history, imagine what such data, gathered on children, day after day, year after year, could yield for corporations and government interests.

The psy ops tactics used to get Donald Trump elected to the U.S. Presidency (still having gag reflex) are the same ones being used in public schools, using children as their “data” source. Given the power they had in influencing the electorate, imagine what they could do with 12 years of public school data collected on your child.

What data? And how was it used?

A psychologist named Michael Kosinski (see full report) from Cambridge developed a method to analyze Facebook members using their cute little personality quizzes and games. What started as a fun experiment resulted in the largest data set combining psychometric scores with Facebook profiles ever to be collected. Dr. Kosinski is a leading expert in psychometrics, a data-driven sub-branch of psychology. His work is grounded in the Five Factor theory of personality, often summarized by the acronym OCEAN: openness, conscientiousness, extraversion, agreeableness, and neuroticism.

So many people volunteered their personal information to play these games and take these quizzes that before long Kosinski had volumes of data from which he could now predict all sorts of things about the attitudes and behaviors of these individuals. He applied the Five Factors (Big Five Theory) model (well-known in psychometric circles) and developed a system by which he could predict very personal and detailed behaviors of individuals on a level deeper than had been accessed by prior models or systems.
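To make the technique concrete, here is a toy sketch of the general approach described above: fit a model that maps patterns of “likes” to self-reported trait scores, then use it to predict the traits of people who never took the quiz. The data, weights, and numbers below are invented for illustration; this is not Kosinski’s actual model or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Rows = users, columns = pages each user has liked (1) or not (0).
likes = rng.integers(0, 2, size=(200, 50)).astype(float)

# Self-reported "openness" quiz scores for the same users (training labels).
openness = likes @ rng.normal(size=50) * 0.1 + rng.normal(size=200) * 0.5

# Fit a ridge-regularized linear model by solving the normal equations.
lam = 1.0
weights = np.linalg.solve(likes.T @ likes + lam * np.eye(likes.shape[1]),
                          likes.T @ openness)

# Predict the trait score of a new user from nothing but their likes.
new_user_likes = rng.integers(0, 2, size=50).astype(float)
print(f"Predicted openness score: {new_user_likes @ weights:.2f}")
```

The same fitted weights can then be applied at scale to anyone whose likes are visible, which is why a modest volunteer data set can translate into predictions about an entire population.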

Enter Cambridge Analytica (CA), a company connected to a British firm called SCL Group, which provides governments, political groups and companies around the world with services ranging from military disinformation campaigns to social media branding and voter targeting. CA indirectly acquired Kosinski’s model and method for his MyPersonality database without his consent.

Then, CA was hired by the Trump team to provide “dark advertising” that would sway undecided people toward a Trump vote. CA was able to access this data to search for specific profiles: “all anxious fathers, all angry introverts, for example—or maybe even all undecided Democrats.” See motherboard.vice.com/en_us/article/big-data-cambridge-analytica-brexit-trump

Steve Bannon sits on the board for Cambridge Analytica.

“We are thrilled that our revolutionary approach to data-driven communication has played such an integral part in President-elect Trump’s extraordinary win,” Alexander James Ashburner Nix was quoted as saying. According to Motherboard, “His company wasn’t just integral to Trump’s online campaign, but to the UK’s Brexit campaign as well.” In Nix’s own words, it worked like this: “At Cambridge,” he said, “we were able to form a model to predict the personality of every single adult in the United States of America.”

The report continues, “according to Nix, the success of Cambridge Analytica’s marketing is based on a combination of three elements: behavioral science using the OCEAN Model, Big Data analysis, and ad targeting. Ad targeting is personalized advertising, aligned as accurately as possible to the personality of an individual consumer.” These same consumers then receive “dark posts”: advertisements specifically devised for them that cannot be viewed by anyone other than that person.

Where did the Big Five Theory come from?

Dr. Raymond Cattell is celebrated in Western culture for his so-called notable contributions to the field of intelligence assessment (IQ and personality work). Despite his direct and profound relationship to the eugenics movement and his recognition by the Nazi Party for the birth of The Beyondists, his work is benignly promoted in scholarly circles. But the fact that he is professionally legitimized does not make him any less the racist he was. And his contributions toward racist practices live on. He is credited with two notable frameworks for personality development and measurement: the Big Five Theory and the Sixteen Personality Factor Questionnaire (16PF).

How OCEAN Five Factor personality data from our students can be used:

The recent trend toward a “grit narrative,” hailed by Angela Duckworth and others, has been gobbled up by school districts around the country. The OCEANS model is used widely by schools and other institutions internationally.

“The grit measure has been compared to the Big Five personality model, which is a group of broad personality dimensions consisting of openness to experience (aka openness), conscientiousness, extraversion, agreeableness, and neuroticism.”

(Citation: Cattell, R. B.; Marshall, M. B.; Georgiades, S. (1957). “Personality and motivation: Structure and measurement.” Journal of Personality Disorders, 19(1): 53–67. doi:10.1521/pedi.19.1.53.62180. PMID 15899720.)

There is a growing emphasis on the “affective” learning of students. Some examples include: “ETS’ SuccessNavigator assessment and ACT’s Engage College Domains and Scales Overview … the broader domains in these models are tied to those areas of the big five personality theory.”

Also see Empirical identification of the major facets of Conscientiousness 

While “grit” has been exposed for the racist narrative it is, it’s also a direct by-product of the same OCEAN framework used to control, predict, and manipulate voters. If this data can sway major national elections and change the global trajectory of history, imagine what such data, gathered on children, day after day, year after year, could yield for corporations and government interests.

Watch the video from Jesse Schell, gaming CEO, to see exactly where this can go. As Schell says, “your shopping data is a goldmine,” and it’s only a matter of time before gaming companies and gaming behavior interface with our daily consumer and behavioral choices. You can get points for simply brushing your teeth long enough when product brands partner with gaming systems.

We now have, thanks to perpetual assessment of children’s knowledge, affect, “grit,” and personality, “the concept of the ‘preemptive personality,’ the endlessly profiled and guided subject who is shunted into recalculated futures in a system that could be characterized as digital predestination.”

The role of education technology (aka “personalized learning”):

According to a report entitled Networks of Control: “Jennifer Whitson (2013) argues that today’s technology-based practices of gamification are ‘rooted in surveillance’ because they provide ‘real-time feedback about users’ actions by amassing large quantities of data’. According to her, gamification is ‘reliant on quantification’, on ‘monitoring users’ everyday lives to measure and quantify their activities’. Gamification practices based on data collection and quantification are ‘leveraging surveillance to evoke behavior change’ … While self-quantification promises to ‘make daily practices more fulfilling and fun’ by adopting ‘incentivization and pleasure rather than risk and fear to shape desired behaviours’, it also became ‘a new driving logic in the technological expansion and public acceptance of surveillance’.”

(See Wrenching The Gears for more readings on this issue)

Badges Find Their Way to San Jose, Philadelphia (and the Point Defiance Zoo)

Reposted with permission from Wrench in the Gears.

LRNG playlist

 

In this brave, new world education will no longer be defined as an organic, interdisciplinary process where children and educators collaborate in real-time, face-to-face, as a community of learners. Instead, 21st century education is about unbundling and tagging discrete skill sets that will be accumulated NOT with the goal of becoming a thoughtful, curious member of society, but rather for attaining a productive economic niche with as little time “wasted” on “extraneous” knowledge as possible. The problem, of course, is that we know our children’s futures will depend on flexibility, a broad base of knowledge, the ability to work with others, and creative, interdisciplinary thinking, none of which are rewarded in this new “personalized pathway/badging” approach to education.

San Jose LRNG Badges

Yesterday I watched a May 7, 2018 meeting held by the City Council of San Jose on education and digital literacy efforts related to the LRNG program, an initiative of the MacArthur Foundation-funded Collective Shift. Philadelphia is also a City of LRNG. Below is a five-minute clip in which they describe their digital badging program rollout.

Collecting an online portfolio of work-aligned skills is key to the planned transition to an apprenticeship “lifelong learning” model where children are viewed as human capital to be fed into an uncertain gig economy. Seattle Education’s recent post “Welcome to the machine” describes what is happening as Washington state follows the lead of Colorado and Arizona in pushing “career-connected” education.

Philadelphia’s LRNG program is called Digital On Ramps and is linked to WorkReady, the city’s youth summer jobs program. For the past several years children as young as fourteen have been encouraged to create online accounts and document their work experience using third party platforms. Opportunities to win gift cards and iPad minis have been used as incentives to complete the online activities. Within the past year the LRNG program has grown to include numerous badges related to creating and expanding online LinkedIn profiles. Microsoft bought LinkedIn for $26 billion in 2016. See screenshots below.

LRNG Contest

LRNG Contest 2

Below are excerpts from two previous posts I wrote about badges and Digital On Ramps. Activity is ramping up around online playlist education and the collection of competencies/badges using digital devices. We need to be paying attention. The first is from “Trade you a backpack of badges for a caring teacher and a well-resourced school” posted October 2016.

“This is not limited to K12 or even P20; the powers that be envision this process of meeting standards and collecting badges to be something we will have to do our ENTIRE LIVES. If you haven’t yet seen the “Learning is Earning” video, stop now and watch it, because it makes this very clear. Badges are representations of standards that have been met, competencies that have been proven. Collections of badges could determine our future career opportunities. The beauty of badges from a reformer’s perspective is that they are linked to pre-determined standards and can be earned “anywhere.” You can earn them from an online program, from a community partner, even on the job. As long as you can demonstrate you have mastery of a standard, you can claim the badge and move on to the next bit of micro-educational content needed to move you along your personalized pathway to the workforce.

In this brave, new world education will no longer be defined as an organic, interdisciplinary process where children and educators collaborate in real-time, face-to-face, as a community of learners. Instead, 21st century education is about unbundling and tagging discrete skill sets that will be accumulated NOT with the goal of becoming a thoughtful, curious member of society, but rather for attaining a productive economic niche with as little time “wasted” on “extraneous” knowledge as possible. The problem, of course, is that we know our children’s futures will depend on flexibility, a broad base of knowledge, the ability to work with others, and creative, interdisciplinary thinking, none of which are rewarded in this new “personalized pathway/badging” approach to education.

The reformers needed to get data-driven, standards-based education firmly in place before spotlighting their K12 badge campaign. Low-key preparations have been in the works for some time. In 2011, Mozilla announced its intention to create an Open Badges standard that could be used to verify, issue, and display badges earned via online instructional sites. The MacArthur Foundation and HASTAC (Humanities, Arts, Science, and Technology Alliance and Collaboratory) supported this effort. In 2013 a citywide badging pilot known as “The Summer of Learning” was launched in Chicago. 2013 was also the year that the Clinton Global Initiative joined the badge bandwagon. They have since agreed to incorporate badges into their operations and work to bring them to scale globally as part of the Reconnect Learning collaborative.

Other partners in the “Reconnect Learning” badging program include: The Afterschool Alliance, Badge Alliance, Blackboard, Digital Promise, EdX, ETS, Hive Learning Networks, Pearson, Professional Examination Service and Council for Aid to Education, and Workforce.IO.

The Chicago Summer of Learning program expanded nationally and has since evolved into LRNG Cities, a program of the MacArthur Foundation. According to their website: “LRNG Cities combine in-school, out-of-school, employer-based and online learning experiences into a seamless network that is open and inviting to all youth. LRNG Cities connect youth to learning opportunities in schools, museums, libraries, and businesses, as well as online.”

In some ways such a system may sound wonderful and exciting. But I think we need to ask ourselves if we shift K12 funding (public, philanthropic, or social impact investing) outside school buildings, and if we allow digital badges to replace age-based grade cohorts, report cards, and diplomas, what are we giving up? Is this shiny, new promise worth the trade-off? Many schools are shadows of their former selves. They are on life support. It is very likely that expanding the role of community partners and cyber education platforms via badging will put the final nail in the coffin of neighborhood schools.” Read full post here.
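For readers who want to picture the mechanics behind the badge-and-standards flow described in the excerpt above, here is a minimal sketch. The field names, standard codes, and issuer are hypothetical illustrations, not the literal Open Badges specification.

```python
from datetime import date

def issue_badge(learner_id: str, standard_id: str, evidence_url: str) -> dict:
    """Return a portable record asserting that a learner has met a standard."""
    return {
        "recipient": learner_id,              # who earned it
        "standard": standard_id,              # the pre-determined competency
        "evidence": evidence_url,             # proof of mastery (project, test, job task)
        "issued_on": date.today().isoformat(),
        "issuer": "ExampleCommunityPartner",  # could be an app, a museum, or an employer
    }

# A learner's "backpack" is just an accumulating list of such records, which
# platforms or employers can later query for workforce alignment.
backpack = [
    issue_badge("student-12345", "MATH.RATIOS.LEVEL-2", "https://example.org/evidence/1"),
    issue_badge("student-12345", "SOFTSKILL.COLLABORATION", "https://example.org/evidence/2"),
]
print(len(backpack), "badges earned")
```

The point is that the credential is reduced to a row of machine-readable data tied to a learner ID, which is exactly what makes it easy to aggregate, rank, and trade.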

The second is from “Will “Smart” Cities lead to surveilled education and social control?” posted July 2017.

“Philadelphia has been on the Smart Cities bandwagon since 2011, when it teamed up with IBM to develop Digital On Ramps, a supposedly “groundbreaking” human capital management program. As part of this initiative Philadelphia Academies, led at the time by Lisa Nutter (wife of former mayor Michael Nutter, of Democrats for Education Reform), developed a system of badges for youth that promoted workforce-aligned “anywhere, any time learning.” You can view a 2012 HASTAC conference presentation on the program starting at timestamp 50:00 of this video. Lisa Nutter now works as an advisor to Sidecar Social Finance, an impact investment firm, and Michael Nutter is, among other things, a senior fellow with Bloomberg’s What Works Cities. This relationship map shows some of the interests surrounding the Digital On Ramps program. Use this link for an interactive version.

Digital On Ramps has since combined with Collective Shift’s initiative City of LRNG operating with support from the MacArthur Foundation. Besides Philadelphia, ten other Cities of LRNG are spread across the country: Chicago, Columbus, Detroit, Kansas City, Orlando, San Diego, San Jose, Sacramento, Washington, DC and Springfield, OH.

The premise is the “city is your classroom” where students “learn” through playlists of curated activities that are monitored via phone-based apps. Many of these cities are also “smart” cities. The Philadelphia program is presently housed at Drexel University, an institution that is involved in education technology research and development, that is a partner in Philadelphia’s Promise Zone initiative (education is a major component), and whose president John Fry served a term on the board of the Philadelphia School Partnership, the city’s ed-reform engine. Drexel’s graduate school of education is currently the lead on an unrelated NSF-funded STEM educational app and badging program being piloted with Philadelphia teachers in the Mantua neighborhood within the Promise Zone. It is touted as “an immersive, mentor-guided biodiversity field experience and career awareness program.”

In April 2017, Drexel’s School of Education hosted a lecture by DePaul University’s Dr. Nichole Pinkard entitled “Educational Technologies in a Time of Change in Urban Communities,” in which the MacArthur-funded 2013 Chicago Summer of Learning pilot was discussed. In this clip from the Q&A that followed the lecture, an audience member raised concerns about credit-bearing out-of-school time learning in the ecosystem model.

The 2011 IBM summary report for Digital On Ramps noted that among the four top priority recommendations was the creation of a “federated view of the citizen in the cloud.” Of course, 2011 predates developments like Sesame Credit, but looking at it now I can’t help but conjure up an image of the “federated citizen in the cloud” as portrayed in Black Mirror’s dystopian Nosedive episode.

Digital On-Ramps appears to be a prototype for a career pathway, decentralized learning ecosystem model for public education. As the task-rabbit, gig economy becomes more entrenched with freelancers competing for the chance to provide precarious work at the lowest rate (see this short clip from Institute for the Future’s video about Education and Blockchain), what will it mean to reduce education to a series of ephemeral micro-credentials? And what dangers are there in adding behavioral competencies from predictive HR gaming platforms like Knack into the mix? Tech and human capital management interests are counting on the fact that people are intrigued by new apps. We’re predisposed to seek out pleasurable entertainment. Gamification is both appealing and distracting, consequently few people contemplate the downside right away, if ever.” Read full post here.

-Alison McDowell

Editor’s Note: The Point Defiance Zoo and Aquarium is a community partner with LRNG and offers badges. To learn more click here. -Carolyn Leith

Data Unicorns? Tech Giants and US Dept of Ed Form Alliance to Leverage Student Data — Without Parent Consent.

Reposted with permission from Missouri Education Watchdog.

Leveraging Student Data

Project Unicorn: Billionaire partners promoting data interoperability and online “Personalized Learning”

When the Unicorns “protecting” student data are interoperable with the Unicorns taking it, parents and lawmakers might want to pay attention.

According to Techopedia, in the Information Technology world, “a unicorn is most commonly used to describe a company, for example, a Silicon Valley startup, that started out small but has since increased its market capitalization to, say, $1 billion or more. …For example, the social media giant Facebook, which has a market capitalization of more than $100 billion, is considered as a “super-unicorn among unicorns”.” It is an interesting coincidence, then, that a mega-financed K-12 student data alliance has named itself after a unicorn.

Meet Project Unicorn.

Project Unicorn’s Mission is to Leverage Student Data and Make Data Interoperable

Project Unicorn

Project Unicorn’s steering committee is a who’s who of edtech bundlers, billionaires, and student data power-players. They have formed an “uncommon alliance” committed to leveraging student data by making the data interoperable, so that it flows seamlessly between all K-12 applications and platforms. While addressing student data security and privacy is a much needed conversation, it would seem that Project Unicorn has the cart before the horse. There is no talk of student data ownership or consent prior to collecting and using student data; rather, per this press release, Project Unicorn will continue to take the data, make it interoperable, and talk about it afterwards: “Once interoperability is in place, we can start working with teachers and students to ask questions about the data.” You can see from the tweets below that Project Unicorn initially claimed it wanted to “shift data ownership to the student”; they have since withdrawn that statement. Several schools and districts have been encouraged to join the Project Unicorn Coalition; we wonder if parents in these schools were given an option or are even aware of what this means. We’re going to talk about a few of the Project Unicorn partners and then circle back to their interoperability goals and how that fits with student data ownership, ethics, and the newly formed and related Truth About Tech and Humanetech.

A few points before we start:

  • When it comes to “free” edtech products, you know if it is free, you are the product; you pay with your data and your privacy. With edtech and 1:1 devices, personalized learning, online assessments, online homework, and LMS systems, students usually do not have a choice. Students do not have the ability to consent or opt out. Why?
  • Not all philanthropy is charity. As this article points out, for some, philanthropy is an investment: these nonprofits may “look” charitable, but they are truly meant to make money, buy power, and influence policy, and they sometimes do harm.
  • McKinsey Global estimated that increasing the use of student data in education could unlock between $900 billion and $1.2 trillion in global economic value. 
  • Children are not data points to predict, standardize and analyze. Currently online platforms can collect every keystroke, and analyze and predict children’s behaviors. Children are not meant to be experimented on, and #KidsAreNotInteroperable.
  • Currently, students’ data can be shared, researched, analyzed, and marketed without parental consent. Often, parents cannot refuse the data sharing, and cannot see which data points are shared or how they are analyzed.
  • Edtech and Silicon Valley companies can gain access to personal student information without parent consent, under the School Official exception in FERPA. The US Department of Education not only promotes edtech companies, it tells tech companies HOW to gain access to student data, and is partnered in this project to make data sharing interoperable.
  • Interoperable data systems will allow even larger, more predictive data profiles of children: everything they do and everything they are. The best way to protect privacy is to not collect data in the first place. Interoperability creates bigger, more detailed, and more sensitive data sets, and sharing and mixing data with third parties is risky for both privacy and security. The US Department of Education has already warned of cyber hackers ransoming sensitive data from schools; who will be responsible and liable for more data breaches?

Back to unicorns.

How is the US Department of Education involved with Project Unicorn? 

The USDoE (your tax dollars) has been a major driving force of funding and support for online education and data interoperability. Part of that interoperability requires common data standards. CEDS (Common Education Data Standards) are codes used to tag student data; you can see the more than 1,700 different data codes, or elements, in the federal student data dictionary. These common data tags were created with the help of Bill Gates, funder of the Data Quality Campaign; read about the mission of DQC at the US Department of Education Summit here. Data Quality Campaign also provides policy guidance to legislators and education agencies, such as this 2018 DQC Roadmap promoting Cross-Agency data sharing. With the shift in education toward workforce talent pipelines (see both ESSA and WIOA), the Workforce Data Quality Campaign (funded by the Gates, Lumina, Arnold, and Joyce Foundations) has also influenced the US Department of Labor. The US Department of Labor-Workforce Data Quality Initiative plans to use personal information from each student, starting in pre-school, via the states’ SLDS data systems. You can read more about the SLDS, the roles that the US Department of Education and Bill Gates play in student data collection, and the weakening of the federal privacy law FERPA here. In recent years Microsoft’s commitment to data privacy has been called into question, as per this EdWeek article. Even Microsoft itself admits it can take a peek and trend through student data and can put it on the market.

“If students are using certain cloud infrastructures, and it’s held by a third party, it is possible for [the vendors] to trend through the data,” said Allyson Knox, director of education policy and programs for Microsoft. “When [information] is flowing through a data center, it’s possible to take a peek at it and find trends and put it on the market to other businesses who want to advertise to those students.”

Knox said Microsoft has a “remote data center” where student information is housed but that “students’ data belongs to them.” -Microsoft https://www.fedscoop.com/lawmakers-hear-testimony-on-student-data-and-privacy/                     

Does Microsoft still believe that student data belongs to the student?

Gates: In 5 Years

Microsoft, Bill and Melinda Gates Foundation

The Bill and Melinda Gates Foundation is a nonprofit whose IRS 990 forms can be seen here and (2016) here and TRUST here; their awarded grants can be seen in this searchable database. Gates spends billions on K-12 and higher ed reform. Gates and the Data Quality Campaign both support a national student database, and now Gates is shifting his multi-billion-dollar focus from Common Core to K12 networks and curriculum.

(See With new focus on curriculum, Gates Foundation wades into tricky territory.)

Microsoft is desperately hoping to regain ground in the K-12 classroom 1:1 device market with management systems, cloud services, the gamification of education (yes, Microsoft owns Minecraft and is promoting Minecraft in classrooms), K-12 LinkedIn data badges (yes, Microsoft owns LinkedIn, and yes, there are LinkedIn K-12 badge pilots in AZ and CO), the introduction of chatbots and Artificial Intelligence into education, and several online tools like Microsoft OneNote, favorably reviewed here by their unicorn partner Digital Promise. Microsoft is also part of the US Department of Education’s push for online curriculum, via Open Ed Resources (OERs). Microsoft will be handling and indexing the content for the Federal Learning Registry. (You can read more about how the Federal Department of Defense and Department of Education are involved in OERs here.)

According to this December 2017 New York Times piece, Microsoft is fiercely trying to regain ground in the K-12 classroom market.

Tech companies are fiercely competing for business in primary and secondary schools in the United States, a technology market expected to reach $21 billion by 2020, according to estimates from Ibis Capital, a technology investment firm, and EdtechXGlobal, a conference company.

It is a matter of some urgency for Microsoft. 

Chromebooks accounted for 58 percent of the 12.6 million mobile devices shipped to primary and secondary schools in the United States last year, compared with less than 1 percent in 2012, according to Futuresource Consulting, a research company. By contrast, Windows laptops and tablets made up 21.6 percent of the mobile-device shipments to schools in the United States last year, down from about 43 percent in 2012. – https://www.nytimes.com/2017/05/02/technology/microsoft-google-educational-sales.html [Emphasis added]

Digital Promise

If you aren’t familiar with Digital Promise, it is a non-profit created by the US Department of Education to PROMOTE edtech in the classroom. Read about Digital Promise and Global Digital Promise here. Digital Promise is demanding data interoperability for school districts. Digital Promise presented their report The Goals and Roles of Federal Funding for EdTech Research at this 2017 symposium, which was funded by tech foundations and corporations such as Bill and Melinda Gates, Chan-Zuck, Strada, Pearson, Carnegie… you get the idea. In their report, Digital Promise acknowledges that the federal government has spent significant money on developing and disseminating technology-based products in the classroom with little to no information on how these products are working. So, is the answer to rely on tech-financed entities and unicorns to review and research the efficacy of future edtech products? No conflict of interest there. Digital Promise also utilizes the heavily Gates-funded and controversial Relay Graduate School, which you can read about here.

The algorithm-driven Personalized Learning model does not work.

Digital Promise and others in edtech continue to push for online Personalized Learning despite many warnings from edtech insiders, including this piece from Paul Emerich, entitled Why I Left Silicon Valley, EdTech, and “Personalized” Learning. Emerich’s concerns with algorithm-driven Personalized Learning are summed up in this quote,

“It was isolating with every child working on something different; it was impersonal with kids learning basic math skills from Khan Academy; it was disembodied and disconnected, with a computer constantly being a mediator between my students and me.”

And in this piece by Rick Hess, A Confession and a Question on Personalized Learning, the CEO of Amplify admits Personalized Learning is a failure. We wish every policy wonk and educrat would read this:

…“Until a few years ago, I was a great believer in what might be called the “engineering” model of personalized learning, which is still what most people mean by personalized learning. The model works as follows:

You start with a map of all the things that kids need to learn.

Then you measure the kids so that you can place each kid on the map in just the spot where they know everything behind them, and in front of them is what they should learn next.

Then you assemble a vast library of learning objects and ask an algorithm to sort through it to find the optimal learning object for each kid at that particular moment.

Then you make each kid use the learning object.

Then you measure the kids again. If they have learned what you wanted them to learn, you move them to the next place on the map. If they didn’t learn it, you try something simpler.

If the map, the assessments, and the library were used by millions of kids, then the algorithms would get smarter and smarter, and make better, more personalized choices about which things to put in front of which kids.

I spent a decade believing in this model—the map, the measure, and the library, all powered by big data algorithms.

Here’s the problem: The map doesn’t exist, the measurement is impossible, and we have, collectively, built only 5% of the library.

To be more precise: The map exists for early reading and the quantitative parts of K-8 mathematics, and much promising work on personalized learning has been done in these areas; but the map doesn’t exist for reading comprehension, or writing, or for the more complex areas of mathematical reasoning, or for any area of science or social studies. We aren’t sure whether you should learn about proteins then genes then traits—or traits, then genes, then proteins.

We also don’t have the assessments to place kids with any precision on the map. The existing measures are not high enough resolution to detect the thing that a kid should learn tomorrow. Our current precision would be like Google Maps trying to steer you home tonight using a GPS system that knows only that your location correlates highly with either Maryland or Virginia.

We also don’t have the library of learning objects for the kinds of difficulties that kids often encounter. Most of the available learning objects are in books that only work if you have read the previous page. And they aren’t indexed in ways that algorithms understand.

Finally, as if it were not enough of a problem that this is a system whose parts don’t exist, there’s a more fundamental breakdown: Just because the algorithms want a kid to learn the next thing doesn’t mean that a real kid actually wants to learn that thing.

So we need to move beyond this engineering model…” — Larry Berger, CEO of Amplify, excerpt Rick Hess Straight Up Blog [Emphasis added]
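As a thought experiment, here is a minimal sketch of the “engineering model” loop Berger describes: a map of things to learn, a measurement that places the kid on the map, a library of learning objects, and an algorithm that picks the next object. Every name and number here is hypothetical, and Berger’s whole point is that the real map, measures, and library largely do not exist.

```python
import random

random.seed(0)

# The "map": an ordered list of things kids supposedly need to learn.
MAP = ["counting", "addition", "subtraction", "multiplication", "division"]

# The "library" of learning objects, plus simpler fallback objects.
LIBRARY = {skill: f"lesson-on-{skill}" for skill in MAP}
SIMPLER = {skill: f"remedial-worksheet-on-{skill}" for skill in MAP}

def measure(skill: str) -> bool:
    """Stand-in for an assessment with enough resolution to detect mastery."""
    return random.random() < 0.7

def personalize() -> list[str]:
    assigned = []
    position = 0                               # place the kid on the map
    while position < len(MAP):
        skill = MAP[position]
        assigned.append(LIBRARY[skill])        # algorithm picks the "optimal" object
        if measure(skill):                     # measure the kid again
            position += 1                      # move to the next place on the map
        else:
            assigned.append(SIMPLER[skill])    # "you try something simpler"
    return assigned

print(personalize())
```

Even in this toy form, everything hinges on the quality of the map, the resolution of the measure, and the coverage of the library, which is exactly where Berger says the model breaks down.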

 

And…Digital Promise just published a 2018 report promoting “Personalized Learning”, co-authored by Tom Vander Ark, here. In this report you can find such gems as this global mantra (including in the US) that learning and teaching knowledge is no longer the main goal of education; it is more important to gather data about how students think and feel.

“According to the World Economic Forum, the top five most valued skills for workers in 2020 are: 1) complex problem solving; 2) critical thinking; 3) creativity; 4) people management; and 5) coordinating with others. This is a far cry from simply needing a grasp of reading, writing, and arithmetic to be marketable to employers. While mastery of the three Rs remains critical, it is merely the launching point and no longer the end goal. We need to re-think the education system” –US Department of Education’s Digital Promise http://digitalpromise.org/wp-content/uploads/2018/01/lps-policies_practices-r3.pdf

Getting Smart, Tom Vander Ark

Tom Vander Ark is the author and creator of Getting Smart and is the “director of 4.0 Schools, Charter Board Partners, Digital Learning Institute, eduInnovation, and Imagination Foundation, and advises numerous nonprofits.” Vander Ark was also formerly the Executive Director of Education for the Bill and Melinda Gates Foundation. Vander Ark, in this 2011 video, said that Common Core’s mandate of online assessments could be used as a lever to get computers into the classroom, computers for personalized learning to help replace teachers. Tom Vander Ark also said gone are the “days of data poverty” once we use online formative tests rather than end-of-year high-stakes tests. Vander Ark is also featured in this Global Education Futures conference; notice that Vander Ark is speaking on how to Unbundle Billions in Education.

Dell Foundation.

What could Dell computers possibly have to do with tech in schools and student data, you ask? For starters, Dell funds some heavy hitters in data analytics, such as McKinsey and Boston Consulting Group. Dell also has a “free” app for high school students called Scholar Snap, which handles students’ personal scholarship data. Interestingly, Scholar Snap is also partnered with the Common App, both of which are third party vendors within Naviance, a K-12 workforce data platform. (You can read about Naviance and their data mining, including how Common App asks students to waive their FERPA rights, by clicking here.) Additionally, Dell (along with Gates) helps fund CoSN, the makers of the (industry self-policing, self-awarding) Trusted Learning Environment Seal for Student Data. CoSN also promotes data collection and personalized learning. Their “data driven decision making mission” is to “help schools and districts move beyond data collection to use data to inform instructional practice and personalize learning“. Not surprisingly, CoSN is also co-author of this Horizon Report, touting the virtues of Virtual Reality (VR), robotics, and wearable tech, expected to be adopted in K-12 education within the next 3 to 5 years.

“The wearable format enables the convenient integration of tools into users’ everyday lives, allowing seamless tracking of personal data such as sleep, movement, location, and social media interactions. Head-mounted wearable displays such as Oculus Rift and Google Cardboard facilitate immersive virtual reality experiences. Well-positioned to advance the quantified self movement, today’s wearables not only track where people go, what they do, and how much time they spend doing it, but now what their aspirations are and when those can be accomplished.”  –CoSN Horizon Report 2018

Side note: It’s not just students who will be required to track and share their biometric and personal data. As this New York Times piece reports, teachers in West Virginia were required to submit their personal information to a health tracking app or risk a $500 penalty.

They implemented Go365, which is an app that I’m supposed to download on my phone, to track my steps, to earn points through this app. If I don’t earn enough points, and if I choose not to use the app, then I’m penalized $500 at the end of the year. People felt that was very invasive, to have to download that app and to be forced into turning over sensitive information.

The Future of Privacy Forum

The Future of Privacy Forum is a Project Unicorn partner and a DC think tank funded by many tech foundations and corporations, including but not limited to: Amazon, Apple, AT&T, Comcast, Facebook, Google, Microsoft, Verizon, Samsung, Sidewalk Labs (Google’s Alphabet, Smart Cities), Walt Disney, the Bill & Melinda Gates Foundation, and the National Science Foundation. Hobsons (Naviance), Intel, Palantir, Pearson, Netflix, and Mozilla name only a few of their other big name supporters. Their K12 arm focuses on balancing student data privacy while supporting innovation and technology in the classroom.

New technologies are allowing information to flow within schools and beyond, enabling new learning environments and providing new tools to improve the way teachers teach and the way students learn. Data-driven innovations are bringing advances in teaching and learning but are accompanied by concerns about how education data, particularly student-generated data, are being collected and used.

The Future of Privacy Forum believes that there are critical improvements to learning that are enabled by data and technology, and that the use of data and technology is not antithetical to protecting student privacy. In order to facilitate this balance, FPF equips and connects advocates, industry, policymakers, and practitioners with substantive practices, policies, and other solutions to address education privacy challenges.

While it is fantastic to have such a well-funded group concerned about student privacy, we wish they would go further. The Future of Privacy Forum doesn’t advocate for student and parent consent before taking or using student data, nor do they say students should own their own data. We wish they advocated for the right of parents to be ensured paper-and-pencil, book, and face-to-face human teacher alternatives to online curriculum. We also wish that the Future of Privacy Forum would better highlight that predictive algorithms are not regulated or transparent; meta data and personalized, adaptive learning are exempted from state privacy laws, often with this or very similar language:

Nothing in this section

And though the Future of Privacy Forum does promote technology in the classroom, screen addiction is a concern for parents. (Although tech addiction has seen increased media coverage as of late, it’s not new; see this 2015 New York Times article on the toll that screen addiction takes on children. However, surprisingly, some would still argue that tech is not addictive.) When promoting technology in the classroom, the Future of Privacy Forum could do a better job addressing the many well-documented health risks of screen use, including behavioral changes, links to teen depression and suicide, sleep disturbance, and damage to retinas and vision loss, and could better highlight guidance from the American Academy of Pediatrics warning that wireless devices and cell phones can cause cancer.

Common Sense Media

Common Sense Media is a nonprofit that is supported by several foundations, including but not limited to: the Bezos (Amazon) Family Foundation, the Bill and Melinda Gates Foundation, the William and Flora Hewlett Foundation, Carnegie Corporation of NY, the Eli and Edythe Broad Foundation, the Michael & Susan Dell Foundation, the Overdeck Family Foundation, the R.K. Mellon Foundation, Symantec, The Anschutz Foundation, and the Annie E. Casey Foundation. Another of their investors states that, “Common Sense Media provides unbiased and trustworthy information about media and entertainment that helps parents and children make informed choices about the content they consume.”

Can Project Unicorn or any of its partners truly claim to be unbiased, since they are funded by the data-driven tech industry? Since they are in a position to inform and advise on education policy, this is an important question.

Common Sense Media, even after hosting an event about tech addiction (see Truth About Tech below), is still advocating that only certain kinds of screen time exposure are addictive or concerning. Common Sense says that when it comes to screen time, “there really is no magic number that’s ‘just right.’” Parents would argue that while content is certainly important, addiction, retinal damage, cancer risk, permissionless data collection, and online safety risks apply to both educational and non-educational screen time, and affect children regardless of digital content.

Common Sense Tweet

To their credit, Common Sense Kids Action recently hosted a full-day conference (video) on “Truth About Tech – How tech has our kids hooked.” It is great to get this conversation into the spotlight (you can see the agenda here), but there was no mention of giving students and parents ownership and control of how student data is collected, analyzed and shared. With online personalized learning and 1:1 devices being pushed at students as early as kindergarten and preschool, and no laws regulating meta data, data analytics, or hidden algorithms, limiting screen time in schools and consent for data collection should have been discussed. Instead, Common Sense, along with Project Unicorn, is focused on data interoperability to keep the K-12 data flowing, and will continue to ask parents to better control children’s screen time use at home.

Common Sense YouTube

The last segment of Common Sense’s Truth About Tech event, entitled “Solutions for Families, Schools, and Democracy,” was moderated by Rebecca Randall, Vice President of Education Programs at Common Sense, with guest speakers and Common Sense partners Dr. Carrie James, research associate at Project Zero, Harvard School of Education, and Randima Fernando of the Center for Humane Technology. This entire piece is worth your time; Mr. Fernando had some excellent points on gaming and technology. However, we are going to focus on Dr. James’ comments since, as Ms. Randall mentions, it is on Dr. James’ work regarding digital ethics that Common Sense bases their K-12 digital literacy and citizenship curriculum. Common Sense Media is about to begin working again with Dr. James and Harvard’s Project Zero to develop updated K-12 digital guidance.

At the 49-minute mark, Dr. James remarks:

In answering a question about parents as role models: “We have a growing pile of evidence to suggest that parents are not doing a great job in this regard. In recent research that we’re doing with Common Sense, we’ve reached out to schools and teachers across the country and in a couple of countries around the world and asked, you know, what are some of the most memorable digital challenges your schools have faced, and a surprising number of them have to do with parents.”

With screens being so addictive, we agree that many parents and most of society undoubtedly could be better screen time role models; however, we disagree with Common Sense’s continued emphasis only on non-educational screen use. We hope that Common Sense and their partners at Harvard Project Zero, who will be working on the new digital literacy and citizenship curriculum, will consider age-appropriate screen use, health and safety guidelines, parental consent, and data ownership for children using devices and screens for educational purposes, including online homework. Parents send their children to school expecting them to be safe. Many parents do not want their children required to use screens and technology for regular coursework and when learning core subjects. Many parents are uncomfortable with online personalized learning and would prefer face-to-face human teachers and textbooks as an option. The cost of attending public schools should not be mandatory screen exposure and loss of privacy. We hope that Common Sense will address these concerns in their work.

Project Unicorn is Promoting Interoperability. What is it?

An April 2017 Clayton Christensen Institute blog post, published on the Project Unicorn news website, explains the path to data interoperability this way:

“The first path toward interoperability evolves when industry leaders meet to agree on standards for new technologies. With standards, software providers electively conform to a set of rules for cataloging and sharing data. The problem with this approach in the current education landscape is that software vendors don’t have incentives to conform to standards. Their goal is to optimize the content and usability of their own software and serve as a one-stop shop for student data, not to constrain their software architecture so that their data is more useful to third parties.

Until schools and teachers prioritize interoperability over other features in their software purchasing decisions, standards will continue to fall by the wayside with technology developers. Efforts led by the Ed-Fi Alliance, the Access for Learning Community, and the federal government’s Common Education Data Standards program, all aim to promote common sets of data standards. In parallel with their (sic) these efforts, promising initiatives like the Project Unicorn pledge encourage school systems to increase demand for interoperability.”  [Emphasis added] https://www.christenseninstitute.org/blog/making-student-data-usable-innovation-theory-tells-us-interoperability/

A one-stop shop for student data, flowing seamlessly for third parties: Interoperability. 
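To see concretely why shared data standards make this kind of seamless flow possible, here is a small sketch. The element names below are hypothetical placeholders rather than actual CEDS identifiers; the point is that once every vendor labels the same fact the same way, any connected system can pull whatever it wants from a student’s record.

```python
# Hypothetical, simplified student record keyed by shared element names.
student_record = {
    "StudentIdentifier": "PA-0012345",
    "Assessment.ScaleScore": 412,
    "Attendance.DaysAbsent": 7,
    "Discipline.IncidentCount": 1,
    "ProgramParticipation.FreeReducedLunch": True,
}

def export_for_third_party(record: dict, requested_elements: list[str]) -> dict:
    """Any system that knows the shared element names can extract what it wants."""
    return {name: record[name] for name in requested_elements if name in record}

# A vendor, researcher, or workforce agency selects the elements it cares about.
print(export_for_third_party(student_record,
                             ["StudentIdentifier", "Assessment.ScaleScore"]))
```

Nothing in this sketch asks whether the student or parent agreed to the export, which is precisely the gap the rest of this post is about.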

How will  Project Unicorn help give students ownership of their data? Will students have consent and control over their data? We asked. 

Interestingly, up until a few days ago, Project Unicorn’s twitter profile stated that their focus is “shifting the ownership of data to schools and students.” See this screenshot from February 18, 2018 and a twitter conversation below.

Project Unicorn Tweet 2

Project Unicorn replied the following day, but they did not immediately answer my question about student data consent and ownership. Instead, they listed a few of their partners: Data Quality Campaign, Future of Privacy, Common Sense Media, National PTA. Again, I asked them about their statement about shifting ownership of data to the student.

Project Unicorn Tweet 3

Project Unicorn Tweet 4

Gretchen Logue also replied to Project Unicorn and their partners, asking if students can NOT have their data shared. Two days later, she still had not received a reply.

Logue

I directly asked Project Unicorn’s partner Digital Promise to help answer whether students can consent to data collection. (Remember, DP is the edtech/personalized-learning-promoting non-profit created by the US Department of Ed.) Digital Promise never responded to this parent’s questions. Maybe they just need a little more time, or maybe parents aren’t important enough to bother with?

Tweet 5

tweet 6

tweet 7

Project Unicorn replied: they changed their twitter profile to better reflect the scope of their project. They no longer claim to shift data ownership to students. They are promoting data interoperability. To be clear: they are NOT giving students ownership of their data. See their new twitter profile in this February 23, 2018 screenshot below.

Project Unicorn interoperability

Why do edtech companies and our government have such a problem giving students consent and true ownership of their data? Data is money. Data is identity.  Student data is NOT theirs to take. 

Without the student, the data does not exist. If a student writes an essay for a class assignment, that written work belongs to the student. If a student draws a picture in art class, that artwork is theirs. Parents (and the Fourth Amendment) would argue that personal information about a student, created by a student, should belong to the student.

#TruthinTech: Unicorns are taking student data and sharing it without consent. What say you @HumaneTech?

Humane tech

Tech is hacking kids’ brains, but it is also stealing their data: students’ every keystroke can be collected and analyzed, and student education records can be shared. (FERPA is a 40-year-old law that doesn’t cover data, meta data, or algorithms, and was substantially weakened in 2011 to allow personally identifiable information to be shared outside of the school with nonprofits, researchers, or anyone approved as a school official or for an educational purpose, without parent consent or knowledge.) HumaneTech folks, are you good with this predictive profiling, leveraging, and capitalizing of children who are held hostage in this mandatory, surveilled school system? Schools are the new smart cities, except children are a captive audience and they are being exploited. They have no choice.

Why not do real, independent research, set guidelines and protect kids from screens in schools? Why not give parents and students a choice of tech vs paper, allow the option of learning knowledge vs in-school personality surveys and emotional assessments and biometric health trackers? Why not be transparent about algorithms and analytics and get consent BEFORE collecting and using student or teacher data?

GDPR.

Europe requires consent before collecting and sharing personal data, including data used for automated decision making. GDPR gives Europeans (including students) more control over how their data is handled, including breach notification and penalties, data redaction, and consent. Why would American students be any less deserving than students in Europe? GDPR will have global implications. Modernizing FERPA and COPPA to align with GDPR would be both practical and ethical. Why isn’t Project Unicorn also advocating for the GDPR standard of basic human privacy and data identity rights for American citizens and children?
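For illustration only, here is a minimal sketch of what GDPR-style, purpose-specific consent gating could look like if it were applied to student data: no collection or sharing happens unless consent for that specific purpose is on record. All names are hypothetical; this is not a legal implementation of the regulation.

```python
class ConsentError(Exception):
    """Raised when data is requested without a matching consent record."""

# (student_id, purpose) -> consent granted?
consent_registry: dict[tuple[str, str], bool] = {}

def record_consent(student_id: str, purpose: str, granted: bool) -> None:
    consent_registry[(student_id, purpose)] = granted

def collect(student_id: str, purpose: str, data: dict) -> dict:
    """Refuse to collect or share data unless consent exists for this purpose."""
    if not consent_registry.get((student_id, purpose), False):
        raise ConsentError(f"No consent from {student_id} for purpose '{purpose}'")
    return {"student": student_id, "purpose": purpose, "payload": data}

record_consent("student-12345", "classroom-assessment", granted=True)
collect("student-12345", "classroom-assessment", {"quiz_score": 9})       # allowed
try:
    collect("student-12345", "third-party-marketing", {"quiz_score": 9})  # blocked
except ConsentError as err:
    print(err)
```

The contrast with the interoperability-first approach is the point: here the default is refusal, and the data simply does not move without a recorded, purpose-specific yes.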

A final note. Project Unicorn is not an elected governing body, so why are they directing US education policy? Decisions should be made democratically, by those closest to the children, instead of by a few billionaires. What gives philanthro-funders the right to leverage children’s data and encourage schools with their procurement $trategies? The Edtech Billionaires directing education and experimenting on children have created (and are profiting from) this data-driven problem: teachers are so busy collecting endless data points that they don’t have the time or the freedom to teach. Now the regretful tech industry wants to swoop in and make the data collection process easier, and free up teachers (or replace them?), with a single-sign-on standardized data collection tool. Children are not a product to be leveraged. Please stop using schools and children as a permissionless innovation data supply.

IMS Global

And why oh why, Project Unicorn, are you working with IMS Global?  Uncommon Alliance indeed.

“…interoperability specification for educational click stream analytics created by the education community for the education community. Major educational suppliers are using Caliper to collect millions of events every week and the data is helping to shape teaching and learning on multiple levels. Several leading institutions are also working on putting Caliper in place. Now is a great time for both institutions and suppliers to begin putting learning analytics in place using Caliper.”

IMS Global Learning Consortium
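To give a sense of what “click stream analytics” means in practice, here is an illustrative, simplified event record. The field names are approximations for illustration, not the actual Caliper schema; the point is the granularity: every interaction becomes a timestamped, student-linked record that suppliers can aggregate by the millions each week.

```python
from datetime import datetime, timezone

def clickstream_event(student_id: str, action: str, obj: str) -> dict:
    """Build one illustrative learning-analytics event."""
    return {
        "actor": student_id,      # which student
        "action": action,         # what they did
        "object": obj,            # what they did it to
        "eventTime": datetime.now(timezone.utc).isoformat(),
    }

# Every page view, video pause, or answered question can emit an event like this.
events = [
    clickstream_event("student-12345", "Viewed", "lesson/fractions/page-3"),
    clickstream_event("student-12345", "Paused", "video/photosynthesis"),
    clickstream_event("student-12345", "Submitted", "quiz/unit-2/question-7"),
]
print(len(events), "events captured")
```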

-Cheri Kiesecker

An Interview with Alison McDowell: KEXP’s Mind Over Matters Community Forum


On August 5th, Alison McDowell was a guest on KEXP’s news program Mind Over Matters. You can listen to the interview by clicking on the link below (be patient – it takes a little bit of time for the file to load). A transcript of the interview follows.

Alison McDowell Interview

My concern as a parent is within these adaptive learning systems, I don’t want an online system that has to learn my child to work. I don’t want a system that has to know everything my child did for the last six months, to operate properly. Because I think that becomes problematic. How do you ever have a do over? Like, is it just always building and reinforcing certain patterns of behavior and how you react…it’s, they, I think they present it as flexible and personalized, but in many ways I think it’s limiting.

Mind Over Matters – KEXP

Community Forum

Interview with Alison McDowell

Mike McCormick:  It's time once again for Community Forum, and we're very lucky to have with us live in the studios this morning Alison McDowell. Alison McDowell is a parent and a researcher into the dangers of corporate education reform. She was a presenter this past March here in Seattle; the talk was entitled Future Ready Schools: How Silicon Valley and the Defense Department Plan to Remake Public Education. Alison, thank you very much for coming in and spending time with us this morning.

Alison: Oh, I’m very glad to be here. Thank you so much for having me.

Mike:  So, tell us, how did you get interested and involved with the issue of corporate education reform?

Alison: Well, I’m a I’m a parent. I have a daughter who is sixteen in the public schools of Philadelphia. And we’re sort of a crucible for many different aspects of education reform. We’ve had multiple superintendents from the Broad Academy. We’ve been defunded. Our schools have been, numerous of our schools have been closed, teachers laid off and about three years ago I became involved in the Opt Out movement for high stakes testing. Because at that point I felt that if we were able to withhold the data from that system we would try to be able to slow things down. Because they were using that testing data to close our schools. So I worked on that for a number of years until I saw that the landscape was starting to change. And a lot of it was leading up to the passage of the Every Student Succeeds Act. That that passage. And it seemed at that time that our school district, which is challenging in many respects, was all of a sudden actually interested in Opt Out, and making that, sharing information and materials… Pennsylvania has a legal Opt Out right on religious grounds…and making materials available in various languages. And something just didn’t compute in my head. I’m like, well, even if, if we’re entitled, the fact that they were interested in engaging with us on that, made me sort of question why that was. And then so post ESSA, it became clear that the shift that was going to be taking place was away from a high stakes end of year test and more towards embedded formative assessments. So in our district we’ve seen an influx, even though there isn’t funding for many other things, lots of technology coming in, lots of Chromebooks. Every, all of the students have Google accounts. Google runs our school district. Even though they say philsd.org, their Google accounts, and each student, their email address is actually their student id number. So to access a Chromebook as soon as you login, you know all of that information is tied back into their id number. So the technology was coming in. Many schools were doing multiple benchmark assessments. So there was less and less time for actual meaningful instruction throughout the school year and there were more and more tests taking place, many computerized. So, at that point, we were looking into like, what did this mean, what is the role of technology and the interim testing, in this movement And so, I had come across my…I have a blog. It’s called Wrench in the Gears. It’s a wordpress blog. So you, I have a lot of information there, and it’s all very well documented and linked. My colleague Emily Talmage, who’s a teacher in Maine, who has seen this first-hand. She has a blog: Save Maine Schools. And so I had found her blog and at one point she said, you know…you know, only click on this link, you know, if you’re willing to go down the rabbit hole. And at that point it was, it was a website called Global Education Futures Forum, and they have this agenda for education up to 2035. And it is their projection. And it’s a global…global membership led by Pavel Luksha, who’s connected with the Skolkovo Institute, in Russia. But the local person here, actually he’s very local, is Tom Vander Ark, is one of the US representatives. And so he was former Gates Foundation. And has his own consulting firm now. And it’s based out of Seattle. And, but anyway, so they have sort of what they call a foresight document, a sort of projecting based on trends and patterns, where they see things going for education, like over the next 20 years. 
And so really, they have a very sophisticated map. And all you have to do is sort of look at their map. And then match it up to current events. And you can see, like, where they're pretty much on target where things are headed. And there, they have some really interesting infographics and, one of them, it's a very decentralized system. So education is just like the individual at the center. So everything you're hearing, personalized learning, and individual education plans, like it's one big person and you're the center of your own universe. And sort of around you, there aren't teachers or schools. It's many sort of digital interfaces, and devices, and data-gathering platforms. And this idea that education is a life-long process. Which I think all of us generally agree with, but the idea that you're sort of chasing skills in this new global economy, and like constantly remaking yourself. Or like the gig economy and what that means. And managing your online reputation. Not just your skillsets. But your mindset. And your social outlook. And your behaviors. And the role of gamification. So there are many many elements to this, that if you look into it, I think raise a lot of questions. And increasingly, really over the past five years there's been a lot of discussion about remaking education. Re-imagining education. You know, education for the 21st century. Future Ready Schools. And I think for the most part, parents and community members have been left out of this conversation, of what really does Future Ready Schools mean? And the folks who are running the conversation, are running the agenda, are largely coming from a tech background. And this is something that's built up since the mid-nineties, when the Advanced Distributed Learning Program was set up within the Defense Department and the Department of Education, to have, you know, Tech Learning for all Americans. Which, you know, again, I think we all need to be tech knowledgeable; the question is, how is the tech used, and how in control of your education are you, and your educational data. So anyway, a lot of this is being driven by interests in digitizing education. And really, through austerity mechanisms, pulling more human interaction out of the equation. So we're seeing things like, a number of years ago, Detroit had a kindergarten where they would have a hundred kindergarteners with like one teacher and a couple of aides, and a lot of technology. So there are lots of questions increasingly about the use of technology, especially in the early grades, and I know in Washington State there's been a big push for tablets down to the kindergarten level. Our children are being made part of this sort of larger experiment that has health considerations that have not been closely examined. In terms of eyestrain, audio components, even hygiene with earphones. The wifi aspects. And then also the data collection. So, there's this grand experiment going on for Future Ready Schools, and parents and community members aren't really aware of the fact that it is an unproven experiment, and what the implications are long-term.

Mike: And it’s being driven heavily by corporations that are producing these platforms, this software, the electronics, kind of behind the scenes, because no one knows this is going on except a select group of administrators and teachers?

Alison: Yeah, well so they have, there are a number of like pilot districts. So the idea is sort of, you get a beachhead, and then you roll it out. You convince, I mean they have very sophisticated marketing manuals. Like Education Elements, they say, this is how you do it. You know, first you have a social media campaign, you get the young teachers who are really into tech and you train them up in the way that you wanna do things, and then they mentor all the veteran teachers and you get the principal on board and then you have the parent meetings and it's…again…if you understood it as, like, selling a corporate product as opposed to public education, it might not be so disturbing. Like for me, I find having this sort of corporate approach to marketing a new approach to public education…that's what I find disturbing. I've called this Education 2.0, because I think we're about to see a shift from the earlier version of privatization, which was the end-of-year high-stakes testing, vouchers, charter schools. Those things will all still continue, but they were never the end game. So they have been used as a way to destabilize the landscape of neighborhood schools. And in many cases they've been used to, you know, acquire real estate, further sort of gentrification, insider contracts, like there are many aspects that allow that to become a profit center. But there's going to be a point of diminishing returns. Where sort of like all the easy pickings have been taken. And if you're pursuing sort of a Tayloristic model, like the ultimate efficiency, lean production, Cyber-Education is the end game. So creating a system of education that really has very little in human resources. There's lots of folks within Pearson and IBM and Microsoft who are looking at AI, like everyone will have their own artificially intelligent, like, learning sherpa for their life. You know, and this isn't just K12, this is forever. You know, someone on your shoulder telling you what you should be doing next. But removing the humans from the equation and putting more technology in place. So I think that's what this shift to Education 2.0 is going to be about, is largely cyber, but I think most parents at this point are not comfortable with that model. They wouldn't say, you know, and I will admit, like there's a small group of kids who are highly motivated for whom an exclusively cyber model may work. I mean a lot of the research shows that for most kids the outcomes are not great. So what they will be selling is project-based learning. And that's what you'll hear a lot about, coming up, like in the next couple of years. But those projects won't necessarily be linked to schools. So you'll hear more and more about anytime, anyplace, anywhere, any-pace learning. So they're looking to disconnect education from physical school buildings, and actual teachers in classrooms, to sort of what's called a learning ecosystem model. So something that's more free-flowing, you're just out in the world collecting skills. And that's what was so interesting about, like, the Common Core State Standards set-up. And I know a lot of states have sort of rolled back or renamed them. But the idea of having education tied to very specific standards was a way of atomizing education and making it available for digitization. So if education is a human process of growth and development, that's very murky to try to put in a metric, right? You need bits and bytes.
And so if you create an education that’s strictly around standards and like sub standards and little sets, you can just aggregate those, and collect them or not collect them, and run that as data in a digital platform. So that push toward standards, yes it allowed for school report cards and value added modeling and things that hurt schools and teachers, but it also normalized the idea that education was less a human process and more people collecting things. Like collecting skills and standards, which is what you need for like a competency based education approach.

Mike: So, talk about some of the specific examples…one of the advantages to going into your site is you have links to so many different documents from the very corporations and people that are producing these systems. And one of the examples you’ve talked about in your talk back here in March was something called Tutormate? That was involved, kids getting pulled out of class, to go see, basically AI icons talking to them and they become attached to them…

Alison: Yeah…

Mike: …it’s disturbing.

Alison: Well there were a couple of, there’s a couple of interesting things. I had sort of a slide saying who’s teaching your children? Because increasingly it’s not necessarily their classroom teacher. The chatbot was actually Reasoning Mind, which is a math program. It was developed in Texas. And so it’s been like long-running and gotten a lot of funding, both from public and private sources. About refining sort of a personalized learning towards math. But kids were interacting with these online chat bots and developing connections and relationships to these online presences in their math program. I’m in Pennsylvania. So a lot of, a lot of things are developing in Pittsburgh. They have a whole initiative called Remake Learning in Pittsburgh which I believe is sort of early-stage learning ecosystem model and a lot of that is coming out of Carnegie Mellon because Carnegie Mellon is doing a lot of work on AI and education. And they have something called Alex. So they like the idea of peer-based learning. That sounds attractive like, yeah, kids like to learn from their peers. This, their version of peer-based learning is that you have a giant avatar cartoon peer on a screen and the children interact with this peer on a screen. So that’s something that’s being piloted in southwestern Pennsylvania right now. And then Tutormate is actually a different variation but they were pulling kids out of class, away…these were young children, from their classroom setting to put them in a computer lab to do tutoring with a corporate volunteer via skype, and an online platform. So in this case it actually was a human being, but this was during school hours. This was not a supplement to classroom instruction, this was in lieu of having direct instruction with a certified teacher. They were being put into an online platform with a corporate volunteer and you know, it turns out a number of the sponsors of that program had ties to defense contracting industries. You know, Halliburton, and Booz Allen Hamilton. You know, things that you might wanna question, is that who you want your second grader spending their time chatting with? You know, in lieu of having their second grade teacher teach them reading. So again, there is this shift away from, from teachers. There’s, there’s a model that’s going on right now, within many one-to-one device districts, so districts where every child has their own device. Young kids often have tablets, older kids have Chromebooks, in high-end districts you might have an actual laptop, with some hard-drive on it. The Clayton Christensen Institute, or Innosight Institute, they’ve been pushing blended learning. So blended learning is this new model. Where, there are a number of different ways you can…flipped classrooms, which many people have heard of…but there’s one called a rotational model. So children only have direct access to a teacher a third of the time. Like the class would be split into three groups. And you would be with a teacher for a third of the time, doing peer work a third of the time, and doing online work a third of the time. So again, it’s a way of increasing class size supposedly, like supposedly the quality time you have when you’re with the teacher with the ten kids instead of thirty is supposed to be so great even though maybe you only get fifteen minutes. 
What's happening in other districts is they're saying the time where kids are not with their teachers, and they're just doing online work, they don't really need a teacher present, they could just have an aide. So that's again, in terms of pushing out professional teachers, is that, well if kids are doing online learning, maybe you just need an AmeriCorps volunteer in the room, to make sure that no one's hurting them…each other. You know, and that they're, supposedly, on task. You know I think that's a worrisome trend. And even though they'll sell blended learning as very tech-forward and future-ready, the kids don't love spending time on these devices, like hour after hour after hour. And my concern as a parent is…we're all starting to realize what the implications are for big data. And how we interact with online platforms, either in social media, or other adaptive situations. And how these devices are actually gathering data on ourselves…so, they gather information through keystroke patterns, they all have cameras, they all, you know, the tablets have TouchSense, so theoretically there's body temperature and pulse sensors. Like there's many many elements, are they all being used now? No, but there is that capacity for using them to develop that level of engagement. To understand how you're interacting with these programs. And that's being developed with the Army Research Lab and USC, their Institute for Creative Technologies. And they are developing, a lot of this is being developed in conjunction with the Defense Department, for their interactive intelligent tutoring systems, and with the Navy actually, which is relevant to Seattle. A lot of these early prototyped intelligent tutoring systems have been developed specifically with the Navy in mind. Training very specifically on computer programs, and optimizing that. But once they develop the infrastructure, then they're able to apply that in non-military settings. And so it's making its way out. So there's a lot of data that can be collected, and the other push that you'll start to see is gamification. So games, like gaming in schools. And kids love games, like parents love games. It sounds so fun. But I think what we have to realize is there's a lot of behavioral data that's coming out of the gaming too. That we're not necessarily aware of. And so this push for gamification, or sometimes…like gamified classroom management systems. So Google has something called Classcraft. And all the kids have avatars. And like if they're behaving in class, they can, you know, earn points, or have points deducted, and you're on teams, and you can save your team member or not. And with ESSA having passed, you know, they'll tell the story that like we care about more than just test scores, we really wanna care about the whole child, we, you know, care about children as individuals. Really they wanna collect all of this data, not just on your academic skills, but on your behaviors, and your mindset. And are you gritty, and are you a leader, or are you, you know, flexible, are you resilient. And these gamified platforms, whether they're run by the teacher, or gaming that's done with the students in these simulations, and also AR/VR, augmented reality/virtual reality games that you're starting to see. There's just a lot of information going through, and you have to wonder, how is it being used, what are the privacy implications, and also what are the feedback loops being created?
In terms of how you interact with a platform. Is it reinforcing aspects of your personality that you may or may not want reinforced. My concern as a parent is within these adaptive learning systems, I don’t want an online system that has to learn my child to work. I don’t want a system that has to know everything my child did for the last six months, to operate properly. Because I think that becomes problematic. How do you ever have a do over? Like, is it just always building and reinforcing certain patterns of behavior and how you react…it’s, they, I think they present it as flexible and personalized, but in many ways I think it’s limiting.

Mike: In some of the documentation you present, they have systems that wanna pay attention to whether a person that is working with the program is getting bored, or falling asleep, or whatever, so they were like watching like you know, the eye, literally to see if it’s like where it’s wandering off to…you said they potentially could be checking your, your temperature, your heart rate…

Alison: I mean, you know, are they doing it right now? I don’t know that they, but the capacity is there. And…

Mike: And all that data is being saved somewhere. And shared. In some capacity. We don’t know.

Alison: W…and I think it's very unclear. And I think there are many parents who are very concerned about privacy and working that angle of controlling what data goes in…I mean I think all of us are aware that once something is up in the cloud, even if there are promises made about privacy and protections, nothing is really safe up there. In terms of hacking, or even just legally. Like FERPA, the education records privacy law, has a lot of loopholes. You know, many of these organizations and companies, these third parties, are designated agents of school districts. So they have access to this information. And I will also mention Naviance, because the other shift that we're seeing happening is the shift towards creating an education system that is geared towards workforce development. That children at younger and younger ages should be identifying their passions, and finding their personal pathways to the workforce and the economy. And so Naviance is one of a number of companies that does strengths assessments and surveys. And in many states you can't get your diploma unless your child does a complete battery of assessments, a personality assessment, through Naviance, which is this third-party program. Also linking towards their future college plans, and other things linked in, and very detailed information about people's family situations. So again, the amount of data that's being collected on many many different levels to supposedly, like, guide students moving forward into the economy, I think it merits a larger conversation. And I'm not saying that everyone needs to agree with my position, but I think that the agenda that's being moved forward is being done in a way that, for the most part, parents and community members…there's not been a consensus reached with us. That this is okay. That this new version of school is what we desire.

Mike: And being a parent in the Philadelphia School District, when these new systems are, have been implemented, you know, and the potential use of all, gathering of all your child’s data, I mean, have you been consulted on that prior? Did, every time they bring in a new system did they let you know, oh, we have another piece of software here that potentially could be, you know, data-mining your kid, are you okay with that?

Alison: So I think on the, on the plus side, because we have been so severely defunded, we haven’t seen quite as much of an influx of tech yet. Although I, I anticipate it’s coming. We’ve just had a big roll-out of Minecraft I think in schools. That’s their new thing that they’re, they’re all…there are a number of schools, like within turnaround sort of, that, that are being piloted for these one-to-one devices. I will say that there was an opt-out form for Google Apps for Education. Which is, and I so I opted, I opted my child out of Google Apps for Education. I may have been the only parent in the Philadelphia School District who did that, and it, it makes it complicated because again, there, it’s convenient, you know, it’s a nice, you know, way for teachers not to have to carry around lots of papers, and they have kids put it all on their Google drive. But I, I think we’re all starting to be a little wary about the amount of information and power that Google has, you know, in the world and what the implications are for that. So I think if, if people have concerns around some of these privacy aspects, you know, that’s, that’s a potential starting, starting place, is to opt out of Google Apps for Education, and see where that goes. Or even have targeted like device and data strikes, during the school year. So we don’t get a notice every time there’s a new program. I guess long story short.

Mike: Just a few minutes left. And again, some of the companies, in addition to the Defense Department having early hooks into education reform and online learning, some of the companies involved and heavily investing in this, as an example, like Halliburton and Booz Allen. Which to me…let's say Booz Allen, which is also heavily tied in, they have access to databases that the NSA does, and Edward Snowden worked for Booz Allen.

Alison: I would say like right now, like the Chan Zuckerberg Initiative, LLC, is huge and they’re pushing Summit Basecamp. I know we just have a few min…minutes in closing so I also wanna mention, in addition to tech, we also have global finance interests involved, because in ESSA there are provisions for Pay for Success. Which is where they’re looking to use private venture capital to affect educational outcomes. Either right now it’s in universal pre-k, also early literacy. So we need to be aware of the role that Pay for Success is going to play in this, and that’s essentially like “moneyball” for government. Where they’re looking to save money. I mean there’s a conference that they, they’ve put this together. Evidence based policy. That’s what they call it. That’s sort of the code word. Is that if you can come up with a computerized program that will give you specific success metrics, venture capital can make money on that. So a lot of global finance interests, and impact investing interests are looking, I believe at education as a market, a futures market in student education data. So I have more information on that on my blog. But social impact bonds and Pay for Success are a critical piece to understanding why education is being digitized. Also Hewlett Packard, Microsoft, IBM, the tech interests, Summit Basecamp, AltSchool, Micro Schools are another big component of this. These value-model private schools, if vouchers go through, that, we’re gonna be seeing a lot more of that. The tech is also focusing on Montessori school models, and, and very high-end. So you have Rocketship Academy, which are sort of stripped down versions for low-income districts and, but they’re also marketing tech to affluent families and aspirational families as being sort of future-ready. So it’s really a, there’s many different branded versions of education technology.

Mike: So long story short, you have a kid in, going through school, or, you know, anyone you care about then, this would be something to look into.

Alison: Yes. Understand how much time they're spending on devices. Advocate that school budgets prioritize human teachers and reasonable class sizes, not data-mining and adaptive management systems. And have this conversation in your community: is education about creating opportunities for students to learn and grow together as a community, or is it these isolating personalized pathways, where people are competing against one another? And I think that's a larger conversation we all need to have in our school districts.

Mike: Alright. We're speaking with Alison McDowell. She is a parent and researcher in the Philadelphia school system who produced a series, Future Ready Schools: How Silicon Valley and the Defense Department Plan to Remake Public Education. And again, your website is…

Alison: Wrenchinthegears.com

Mike: Wrenchinthegears.com. And with that we’re unfortunately out of time. I want to thank you for coming and spending time with us this morning.

Alison: Thank you.