Manufacturing Consent: How to Engineer an Education Activist

Statue of Liberty in Disgust


Customer tracking, discriminatory pricing (think airlines), and behavioral design are mature disciplines in retail marketing and the gambling industry.

Social media pulls all of these practices together by collecting users’ personal information, repackaging this data to appeal to marketers, and then selling access to the highest bidder.

It’s a complete loop of commercialized personalization.

To keep the cycle going, Facebook, Twitter, and other social media platforms use likes, retweets, and comments to keep their users engaged and eager to volunteer even more information. These hooks are similar to the tricks used to keep gamblers at the slots and in their seats.

Alison McDowell had this to say about adaptive learning systems:

My concern as a parent is within these adaptive learning systems, I don’t want an online system that has to learn my child to work. I don’t want a system that has to know everything my child did for the last six months, to operate properly. Because I think that becomes problematic. How do you ever have a do over? Like, is it just always building and reinforcing certain patterns of behavior and how you react…it’s, they, I think they present it as flexible and personalized, but in many ways I think it’s limiting.

What’s really different between the commercial personalization we experience on social media and the adaptive learning systems many fear are coming to public education under the guise of personalized learning?

Surveillance Capitalism and the Dawn of Nudge Activism

A popular dismissal of the encroaching surveillance state is ‘who cares if the government, commercial interests, or any other third party has access to my personal information? I have nothing to hide.’

It’s a comforting argument, but it misses the point. It’s not the data that’s the problem, but what can be done with it.

One piece of data could be harmless, but if it’s pooled with millions of other bits and run through an algorithm, suddenly this information has the power to predict your behavior.

If corporations can predict what you’re going to do next, they can also put a price on it, trade it, and build a whole market around it.

Shoshana Zuboff calls this evolution in big-data-mediated economics surveillance capitalism:

It’s now clear that this shift in the use of behavioral data was an historic turning point. Behavioral data that were once discarded or ignored were rediscovered as what I call behavioral surplus. Google’s dramatic success in “matching” ads to pages revealed the transformational value of this behavioral surplus as a means of generating revenue and ultimately turning investment into capital. Behavioral surplus was the game-changing zero-cost asset that could be diverted from service improvement toward a genuine market exchange. Key to this formula, however, is the fact that this new market exchange was not an exchange with users but rather with other companies who understood how to make money from bets on users’ future behavior. In this new context, users were no longer an end-in-themselves.  Instead they became a means to profits in  a new kind of marketplace in which users are neither buyers nor sellers nor products.  Users are the source of free raw material that feeds a new kind of manufacturing process.

Government is equally excited to get in on the predictive capabilities of behavioral surplus. Pay for Success – also known as social impact bonds – is all about creating new opportunities for Wall Street to bet on future behaviors as they pertain to education, policing, incarceration, and healthcare.

Engineering an Education Activist

If social media platforms can predict your behavior, advocacy groups can buy access to it. They also have the power to manipulate your actions – and good intentions –  to serve their own agenda.

Advocacy in this context loses its traditional meaning. Instead, it becomes a data driven exercise aimed at targeting individuals sympathetic to an organization’s issue and then encouraging these targets to repeat the campaign’s message over and over again throughout their social network(s).

Independent thought is discouraged. What’s important is your willingness to repeat the designated message.

In 2014, The Excellent Schools Now coalition, funded by Stand for Children and the League of Education Voters, launched a media advocacy campaign to convince the public to support a college and career ready diploma for Washington State.  (Excellent Schools Now – Final Report)

Who Would Make a Good Education Activist?

Feedback from social media provided the Excellent Schools Now coalition with in-depth knowledge of which individuals would be the best to target as education advocates.

Desirable over-arching characteristics were: engagement with traditionally left-leaning issues, strong personal identification with the Democratic Party, and active involvement in the issues they care about.

At a more granular level, these individuals cared about civil liberties, transportation, gender and racial equality, alternative energy, and gun control, and consistently voted for Democrats.

 

Top Issues

 

Political Affiliation

 

Engagement Activities

Where’s the Nudge?

All of these individuals over-indexed on actively working on the issues they care about – sharing their thoughts publicly, online, and in political articles.

The nudge would come from making the Excellent Schools Now message so attractive to potential targets that they would be unable to resist sharing it. This could be done by emphasizing the message’s connection to an admired member of the Democratic Party who also happens to share the target’s individual sense of justice or equality.

 How to Shut Down Activism if it Gets Out of Hand

What happens when activists start thinking for themselves and no longer need an advocacy group to lead the way?

Don’t worry, the public relations firm West Third Group clearly lays out the time tested plan used to keep activists in their place and repeating the right messages.

Four types of activists — radicals, opportunists, idealists and realists — define most us-vs.-them public battles. Whether the issue is political, cultural or personal, dealing with movements antagonistic to your efforts involves dividing the different types, using different tactics for each group.

  • Isolate the radicals.
  • Get the opportunists on the payroll if needed, or ignore them.
  • Cultivate/educate the idealists and convert them to realists.
  • Co-opt the realists into agreeing with industry.

 

Repeating Messages on Social Media, Is That All There Is?

Jodi Dean has an interesting take on communicative capitalism.

In the United States today, however, they don’t, or, less bluntly put, there is a significant disconnect between politics circulating as content and official politics.  Today, the circulation of content in the dense, intensive networks of global communications relieves top-level actors (corporate, institutional, and governmental) from the obligation to respond. Rather than responding to messages sent by activists and critics, they counter with their own contributions to the circulating flow of communications, hoping that sufficient volume (whether in terms of number of contributions or the spectacular nature of a contribution) will give their contributions dominance or stickiness.  Instead of engaged debates, instead of contestations employing common terms, points of reference, or demarcated frontiers, we confront a multiplication of resistances and assertions so extensive that it hinders the formation of strong counter-hegemonies. The proliferation, distribution, acceleration, and intensification of communicative access and opportunity, far from enhancing democratic governance or resistance, results in precisely the opposite, the post-political formation of communicative capitalism.

Maybe technology isn’t designed to save us.

While we’re burning up our time tweeting, liking, and commenting, the hard work of organizing in the real world is left for another day.

Maybe that’s the whole point.

-Carolyn Leith

 

 


 


A New Campaign! Classrooms, Not Computers: Stop Educating for Profit

Reposted with permission from Educationalchemy.

data-mining


Technology is replacing teachers. Classrooms and students are becoming pipelines of data collection for the profit of private corporate interests. As we saw with the news about Cambridge Analytica and Facebook, we know that “he who owns the data rules the world”. Through powerful lobbying, tech companies are transforming education. Technology that gathers information from students is replacing person-to-person interaction with teachers and ending hands-on learning. The personal data students are forced (unknowingly) to provide to these companies is a gold mine of private information about our children.

We want to end the invasive corporate control of students, schools, and communities being pushed in the name of technology. We want to create actions to eliminate the mining, tracking, and surveillance of student data by government and corporate entities.

The outcomes of this campaign (one personal, one more social/public) are: 1. to protect our individual children/students from corporate surveillance, and 2. to dismantle corporate-led education policies that place public education into the hands of private corporate interests intent on greater social surveillance and control.

There are two problems we must address. First, identifying and sharing what the problem is (it’s complicated). Second, the problem feels too big (technology is everywhere! How can we fight this?).

The problem is larger than the focus of this campaign alone (read more at datadisruptors.com).

Join us at Classrooms, Not Computers.

-Morna McDermott

Data Unicorns? Tech Giants and US Dept of Ed Form Alliance to Leverage Student Data — Without Parent Consent.

Reposted with permission from Missouri Education Watchdog

Leveraging Student Data

Project Unicorn: Billionaire partners promoting data interoperability and online “Personalized Learning”

When the Unicorns “protecting” student data are interoperable with the Unicorns taking it, parents and lawmakers might want to pay attention.

According to Techopedia, in the Information Technology world, “a unicorn is most commonly used to describe a company, for example, a Silicon Valley startup, that started out small but has since increased its market capitalization to, say, $1 billion or more. …For example, the social media giant Facebook, which has a market capitalization of more than $100 billion, is considered as a “super-unicorn among unicorns”.  An interesting coincidence, because the name of a MEGA-financed K-12 student data alliance is a unicorn.

Meet Project Unicorn.

Project Unicorn’s Mission is to Leverage Student Data and Make Data Interoperable

Project Unicorn

Project Unicorn’s steering committee is a who’s who of edtech bundlers, billionaires, and student data power-players. They have formed an “uncommon alliance” committed to leveraging student data by making the data interoperable, flowing seamlessly between all K-12 applications and platforms. While addressing student data security and privacy is a much-needed conversation, it would seem that Project Unicorn has the cart before the horse. There is no talk of student data ownership or consent prior to collecting and using student data; rather, per this press release, Project Unicorn will continue to take the data, make it interoperable, and talk about it afterwards: “Once interoperability is in place, we can start working with teachers and students to ask questions about the data.” You can see from the tweets below that Project Unicorn initially claimed it wanted to “shift data ownership to the student”; they have since withdrawn that statement. Several schools and districts have been encouraged to join the Project Unicorn Coalition; we wonder if parents in these schools were given an option or are even aware of what this means. We’re going to talk about a few of the Project Unicorn partners and then circle back to their interoperability goals and how that fits with student data ownership, ethics, and the newly formed and related Truth About Tech and HumaneTech.

A few points before we start:

  • When it comes to “free” edtech products, you know that if it is free, you are the product; you pay with your data and your privacy. With edtech and 1:1 devices, personalized learning, online assessments, online homework, and LMS systems, students usually do not have a choice. Students do not have the ability to consent or opt out. Why?
  • Not all philanthropy is charity. As this article points out, for some, philanthropy is an investment: these nonprofits may “look” charitable, but they are truly meant to make money, to buy power and influence policy, and sometimes to do harm.
  • The McKinsey Global Institute estimated that increasing the use of student data in education could unlock between $900 billion and $1.2 trillion in global economic value.
  • Children are not data points to predict, standardize, and analyze. Currently, online platforms can collect every keystroke and analyze and predict children’s behaviors. Children are not meant to be experimented on, and #KidsAreNotInteroperable.
  • Currently, students’ data can be shared, researched, analyzed, and marketed without parental consent. Often, parents cannot refuse the data sharing, and cannot see which data points are shared or how they are analyzed.
  • Edtech and Silicon Valley companies can gain access to personal student information without parent consent, under the School Official exception in FERPA. The US Department of Education not only promotes edtech companies, it tells tech companies HOW to gain access to student data, and is partnered in this project to make data sharing interoperable.
  • Interoperable data systems will allow even larger, even more predictive data profiles of children – everything they do, everything they are. The best way to protect privacy is to not collect data in the first place. Interoperability means bigger, more detailed, more sensitive data sets, shared and mixed with third parties, which is risky for both privacy and security. The US Department of Education has already warned of cyber hackers ransoming sensitive data from schools; who will be responsible and liable for more data breaches?

Back to unicorns.

How is the US Department of Education involved with Project Unicorn? 

The USDoE (your tax dollars) has been a major driving force of funding and support for online education and data interoperability. Part of data interoperability requires common data standards. CEDS (Common Education Data Standards) are codes used to tag student data; you can see these more than 1,700 different data codes, or elements, in the federal student data dictionary. These common data tags were created with the help of Bill Gates, funder of the Data Quality Campaign; read about the mission of DQC at the US Department of Education Summit here. Data Quality Campaign also provides policy guidance to legislators and education agencies, such as this 2018 DQC Roadmap promoting Cross-Agency data sharing.

With the shift in education focusing more on workforce talent pipelines (see both ESSA and WIOA), the Workforce Data Quality Campaign (Gates, Lumina, Arnold, and Joyce Foundation funded) has also influenced the US Department of Labor. The US Department of Labor’s Workforce Data Quality Initiative plans to use personal information from each student, starting in preschool, via the states’ SLDS data systems. You can read more about the SLDS, the roles that the US Department of Education and Bill Gates play in student data collection, and the weakening of the federal privacy law FERPA here.

In recent years, Microsoft’s commitment to data privacy has been called into question, as per this EdWeek article. Even Microsoft itself admits it can take a peek and trend through student data and can put it on the market.

“If students are using certain cloud infrastructures, and it’s held by a third party, it is possible for [the vendors] to trend through the data,” said Allyson Knox, director of education policy and programs for Microsoft. “When [information] is flowing through a data center, it’s possible to take a peek at it and find trends and put it on the market to other businesses who want to advertise to those students.”

Knox said Microsoft has a “remote data center” where student information is housed but that “students’ data belongs to them.” -Microsoft https://www.fedscoop.com/lawmakers-hear-testimony-on-student-data-and-privacy/                     

Does Microsoft still believe that student data belongs to the student?

Gates: In 5 Years

Microsoft, Bill and Melinda Gates Foundation

The Bill and Melinda Gates Foundation is a nonprofit whose IRS 990 forms can be seen here and (2016) here, and TRUST here; their awarded grants can be seen in this searchable database. Gates spends billions on K-12 and higher ed reform. Gates and the Data Quality Campaign both support a national student database, and now Gates is shifting his multi-billion-dollar focus from Common Core to K-12 networks and curriculum.

(See With new focus on curriculum, Gates Foundation wades into tricky territory .)

Microsoft is desperately hoping to regain ground in the K-12 classroom 1:1 device market with management systems, cloud, the gamification of education (yes, Microsoft owns Minecraft and is promoting Minecraft in classrooms), K-12 LinkedIn data badges (yes, Microsoft owns LinkedIn – and yes, there are LinkedIn K-12 badge pilots in AZ and CO), the introduction of chatbots and Artificial Intelligence into education, and several online tools like Microsoft OneNote, favorably reviewed here by their unicorn partner Digital Promise. Microsoft is also part of the US Department of Education’s push for online curriculum, via Open Educational Resources (OERs). Microsoft will be handling and indexing the content for the Federal Learning Registry. (You can read more about how the federal Department of Defense and Department of Education are involved in OERs here.)

According to this December 2017 New York Times piece, Microsoft is fiercely trying to regain ground in the K-12 classroom market.

Tech companies are fiercely competing for business in primary and secondary schools in the United States, a technology market expected to reach $21 billion by 2020, according to estimates from Ibis Capital, a technology investment firm, and EdtechXGlobal, a conference company.

It is a matter of some urgency for Microsoft. 

Chromebooks accounted for 58 percent of the 12.6 million mobile devices shipped to primary and secondary schools in the United States last year, compared with less than 1 percent in 2012, according to Futuresource Consulting, a research company. By contrast, Windows laptops and tablets made up 21.6 percent of the mobile-device shipments to schools in the United States last year, down from about 43 percent in 2012. – https://www.nytimes.com/2017/05/02/technology/microsoft-google-educational-sales.html [Emphasis added]

Digital Promise

If you aren’t familiar with Digital Promise, it is a non-profit created by the US Department of Education to PROMOTE edtech in the classroom. Read about Digital Promise and Global Digital Promise here. Digital Promise is demanding data interoperability for school districts. Digital Promise presented their report The Goals and Roles of Federal Funding for EdTech Research at this 2017 symposium, which was funded by tech foundations and corporations such as Bill and Melinda Gates, Chan Zuckerberg, Strada, Pearson, Carnegie… you get the idea. In their report, Digital Promise acknowledges that the federal government has spent significant money on developing and disseminating technology-based products in the classroom with little to no information on how these products are working. So, is the answer to rely on tech-financed entities and unicorns to review and research the efficacy of future edtech products? No conflict of interest there. Digital Promise also utilizes the heavily Gates-funded and controversial Relay Graduate School, which you can read about here.

The algorithm-driven Personalized Learning model does not work.

Digital Promise and others in edtech continue to push for online Personalized Learning despite many warnings from edtech insiders, including this piece from Paul Emerich, entitled Why I Left Silicon Valley, EdTech, and “Personalized” Learning. Emerich’s concerns with algorithm-driven Personalized Learning are summed up in this quote:

“It was isolating with every child working on something different; it was impersonal with kids learning basic math skills from Khan Academy; it was disembodied and disconnected, with a computer constantly being a mediator between my students and me.”

And in this piece by Rick Hess, A Confession and a Question on Personalized Learning, the CEO of Amplify admits Personalized Learning is a failure. We wish every policy wonk and educrat would read this:

…“Until a few years ago, I was a great believer in what might be called the “engineering” model of personalized learning, which is still what most people mean by personalized learning. The model works as follows:

You start with a map of all the things that kids need to learn.

Then you measure the kids so that you can place each kid on the map in just the spot where they know everything behind them, and in front of them is what they should learn next.

Then you assemble a vast library of learning objects and ask an algorithm to sort through it to find the optimal learning object for each kid at that particular moment.

Then you make each kid use the learning object.

Then you measure the kids again. If they have learned what you wanted them to learn, you move them to the next place on the map. If they didn’t learn it, you try something simpler.

If the map, the assessments, and the library were used by millions of kids, then the algorithms would get smarter and smarter, and make better, more personalized choices about which things to put in front of which kids.

I spent a decade believing in this model—the map, the measure, and the library, all powered by big data algorithms.

Here’s the problem: The map doesn’t exist, the measurement is impossible, and we have, collectively, built only 5% of the library.

To be more precise: The map exists for early reading and the quantitative parts of K-8 mathematics, and much promising work on personalized learning has been done in these areas; but the map doesn’t exist for reading comprehension, or writing, or for the more complex areas of mathematical reasoning, or for any area of science or social studies. We aren’t sure whether you should learn about proteins then genes then traits—or traits, then genes, then proteins.

We also don’t have the assessments to place kids with any precision on the map. The existing measures are not high enough resolution to detect the thing that a kid should learn tomorrow. Our current precision would be like Google Maps trying to steer you home tonight using a GPS system that knows only that your location correlates highly with either Maryland or Virginia.

We also don’t have the library of learning objects for the kinds of difficulties that kids often encounter. Most of the available learning objects are in books that only work if you have read the previous page. And they aren’t indexed in ways that algorithms understand.

Finally, as if it were not enough of a problem that this is a system whose parts don’t exist, there’s a more fundamental breakdown: Just because the algorithms want a kid to learn the next thing doesn’t mean that a real kid actually wants to learn that thing.

So we need to move beyond this engineering model…” — Larry Berger, CEO of Amplify, excerpt Rick Hess Straight Up Blog [Emphasis added]
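To see just how mechanical the loop Berger describes is, here is a minimal sketch of the map-measure-library cycle. Every name in it (the skill map, the library, the measurement function, the example learner) is invented for illustration; it is not Amplify’s or any vendor’s actual system.

    # A toy version of the "engineering model" Berger describes: a map of
    # skills, a measurement that places the child on the map, a library of
    # learning objects, and an algorithm that picks the next object.
    # All names and data here are hypothetical.

    SKILL_MAP = ["count to 100", "add within 20", "subtract within 20"]  # the "map"

    LIBRARY = {  # the "library" of learning objects, keyed by skill
        "count to 100": ["counting video", "counting game"],
        "add within 20": ["addition drill", "number-line applet"],
        "subtract within 20": ["subtraction drill"],
    }

    def measured_as_mastered(student, skill):
        """The 'measurement': has this student mastered this skill? (stubbed)"""
        return skill in student["mastered"]

    def next_learning_object(student):
        """The 'algorithm': find the first unmastered skill, serve an object."""
        for skill in SKILL_MAP:
            if not measured_as_mastered(student, skill):
                return LIBRARY[skill][0]
        return None  # end of the map

    learner = {"name": "example learner", "mastered": ["count to 100"]}
    print(next_learning_object(learner))  # -> "addition drill"

Berger’s point is that none of the three ingredients exists at the scale the model assumes: outside early reading and parts of K-8 math, the map is missing, the measurement is too coarse, and the library is mostly unbuilt.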

 

And… Digital Promise just published a 2018 report promoting “Personalized Learning”, co-authored by Tom Vander Ark, here. In this report you can find such gems as the global mantra (including in the US) that learning and teaching knowledge is no longer the main goal of education; it is more important to gather data about how students think and feel.

“According to the World Economic Forum, the top five most valued skills for workers in 2020 are: 1) complex problem solving; 2) critical thinking; 3) creativity; 4) people management; and 5) coordinating with others. This is a far cry from simply needing a grasp of reading, writing, and arithmetic to be marketable to employers. While mastery of the three Rs remains critical, it is merely the launching point and no longer the end goal. We need to re-think the education system” –US Department of Education’s Digital Promise http://digitalpromise.org/wp-content/uploads/2018/01/lps-policies_practices-r3.pdf

Getting Smart, Tom Vander Ark

Tom Vander Ark is the author and creator of Getting Smart and is the “director of 4.0 Schools, Charter Board Partners, Digital Learning Institute, eduInnovation, and Imagination Foundation, and advises numerous nonprofits.” Vander Ark was formerly the Executive Director of Education for the Bill and Melinda Gates Foundation. Vander Ark, in this 2011 video, said that Common Core’s mandate of online assessments could be used as a lever to get computers into the classroom – computers for personalized learning to help replace teachers. Tom Vander Ark also said gone are the “days of data poverty” once we use online formative tests rather than end-of-year high-stakes tests. Vander Ark is also featured in this Global Education Futures conference; notice that Vander Ark is speaking on how to Unbundle Billions in Education.

Dell Foundation.

What could Dell computers possibly have to do with tech in schools and student data, you ask? For starters, Dell funds some heavy hitters in data analytics, such as McKinsey and Boston Consulting Group. Dell also has a “free” app for high school students called Scholar Snap, which handles students’ personal scholarship data. Interestingly, Scholar Snap is also partnered with the Common App, both of which are third-party vendors within Naviance, a K-12 workforce data platform. (You can read about Naviance and their data mining, including how Common App asks students to waive their FERPA rights, by clicking here.) Additionally, Dell (along with Gates) helps fund CoSN, the makers of the (industry self-policing, self-awarding) Trusted Learning Environment Seal for Student Data. CoSN also promotes data collection and personalized learning. Their “data driven decision making mission” is to “help schools and districts move beyond data collection to use data to inform instructional practice and personalize learning“. Not surprisingly, CoSN is also co-author of this Horizon Report, touting the virtues of Virtual Reality (VR), robotics, and wearable tech, expected to be adopted in K-12 education within the next 3 to 5 years.

The wearable format enables the convenient integration of tools into users’ everyday lives, allowing seamless tracking of personal data such as sleep, movement, location, and social media interactions. Head-mounted wearable displays such as Oculus Rift and Google Cardboard facilitate immersive virtual reality experiences. Well-positioned to advance the quantified self movement, today’s wearables not only track where people go, what they do, and how much time they spend doing it, but now what their aspirations are and when those can be accomplished.”  –CoSN Horizon Report 2018

Side note: It’s not just students who will be required to track and share their biometric and personal data. As this New York Times piece reports, teachers in West Virginia were required to submit their personal information to a health tracking app or risk a $500 penalty.

They implemented Go365, which is an app that I’m supposed to download on my phone, to track my steps, to earn points through this app. If I don’t earn enough points, and if I choose not to use the app, then I’m penalized $500 at the end of the year. People felt that was very invasive, to have to download that app and to be forced into turning over sensitive information.

The Future of Privacy Forum

The Future of Privacy Forum is a Project Unicorn partner and a DC think tank funded by many tech foundations and corporations, including but not limited to: Amazon, Apple, AT&T, Comcast, Facebook, Google, Microsoft, Verizon, Samsung, Sidewalk Labs (Google’s Alphabet, Smart Cities), Walt Disney, the Bill & Melinda Gates Foundation, and the National Science Foundation. Hobsons (Naviance), Intel, Palantir, Pearson, Netflix, and Mozilla are only a few more of their big-name supporters. Their K-12 arm focuses on balancing student data privacy while supporting innovation and technology in the classroom.

New technologies are allowing information to flow within schools and beyond, enabling new learning environments and providing new tools to improve the way teachers teach and the way students learn. Data-driven innovations are bringing advances in teaching and learning but are accompanied by concerns about how education data, particularly student-generated data, are being collected and used.

The Future of Privacy Forum believes that there are critical improvements to learning that are enabled by data and technology, and that the use of data and technology is not antithetical to protecting student privacy. In order to facilitate this balance, FPF equips and connects advocates, industry, policymakers, and practitioners with substantive practices, policies, and other solutions to address education privacy challenges.

While it is fantastic to have such a well-funded group concerned about student privacy, we wish they would go further. The Future of Privacy Forum doesn’t advocate for student and parent consent before taking or using student data, nor do they say students should own their own data. We wish they advocated for the right of parents to be ensured paper-and-pencil / book / face-to-face human teacher alternatives to online curriculum. We also wish that the Future of Privacy Forum would better highlight that predictive algorithms are not regulated or transparent; meta data and personalized, adaptive learning are exempted from state privacy laws, often with this or very similar language:

Nothing in this section

And though the Future of Privacy Forum does promote technology in the classroom, screen addiction is a concern for parents. (Although tech addiction has seen increased media coverage as of late, it’s not new; see this 2015 New York Times article on the toll that screen addiction takes on children. However, surprisingly, some would still argue that tech is not addictive.) When promoting technology in the classroom, the Future of Privacy Forum could do a better job addressing the many well-documented health risks of screen use, including behavioral changes, links to teen depression and suicide, sleep disturbance, and damage to retinas and vision loss, and could better highlight guidance from the American Academy of Pediatrics warning that wireless devices and cell phones can cause cancer.

Common Sense Media

Common Sense Media is a nonprofit that is supported by several foundations, including but not limited to: The Bezos (Amazon) Family Foundation, The Bill and Melinda Gates Foundation, The William and Flora Hewlett Foundation, Carnegie Corporation of NY, Eli and Edythe Broad Foundation, Michael & Susan Dell Foundation, Overdeck Family Foundation, R.K. Mellon Foundation, Symantec, The Anschutz Foundation, and the Annie E. Casey Foundation. Another of their investors states that “Common Sense Media provides unbiased and trustworthy information about media and entertainment that helps parents and children make informed choices about the content they consume.”

Can Project Unicorn or any of its Partners truly claim to be unbiased, since they are funded by the data driven tech industry? Since they are in a position to inform and advise on education policy, this is an important question.

Common Sense Media, even after hosting an event about tech addiction (see Truth About Tech below), is still advocating that only certain screen time exposure is addictive or concerning. Common Sense says that when it comes to screen time, “there really is no magic number that’s ‘just right.’” Parents would argue that while content is certainly important, addiction, retinal damage, cancer risk, permissionless data collection, and online safety risks apply to both educational and non-educational screen time, and affect children regardless of digital content.

Common Sense Tweet

To their credit, Common Sense Kids Action recently hosted a full-day conference (video) on “Truth About Tech – How tech has our kids hooked.” It is great to get this conversation into the spotlight (you can see the agenda here), but there was no mention of giving students and parents ownership and control of how student data is collected, analyzed, and shared. With online personalized learning and 1:1 devices being pushed at students as early as kindergarten and preschool, and no laws regulating meta data, data analytics, or hidden algorithms, limiting screen time in schools and consent for data collection should have been discussed. Instead, Common Sense, along with Project Unicorn, is focused on data interoperability to keep the K-12 data flowing, and will continue to ask parents to better control children’s screen time use at home.

Common Sense YouTube

The last segment of Common Sense’s Truth About Tech event, entitled “Solutions for Families, Schools, and Democracy,” was moderated by Rebecca Randall, Vice President of Education Programs at Common Sense, with guest speakers and Common Sense partners Dr. Carrie James, research associate at Project Zero, Harvard Graduate School of Education, and Randima Fernando of the Center for Humane Technology. This entire piece is worth your time; Mr. Fernando had some excellent points on gaming and technology. However, we are going to focus on Dr. James’ comments since, as Ms. Randall mentions, it is on Dr. James’ work regarding digital ethics that Common Sense bases their K-12 digital literacy and citizenship curriculum. Common Sense Media is about to begin working again with Dr. James and Harvard’s Project Zero to develop updated K-12 digital guidance.

At the 49-minute mark, answering a question about parents as role models, Dr. James remarks:

“We have a growing pile of evidence to suggest that parents are not doing a great job in this regard. In recent research that we’re doing with Common Sense, we’ve reached out to schools and teachers across the country and in a couple of countries around the world and asked, you know, what are some of the most memorable digital challenges your schools have faced, and a surprising number of them have to do with parents.”

With screens being so addictive, we agree that many parents and most of society undoubtedly could be better screen time role models; however, we disagree with Common Sense’s continued emphasis only on non-educational screen use. We hope that Common Sense and their partners at Harvard Project Zero, who will be working on the new digital literacy and citizenship curriculum, will consider age-appropriate screen use, health and safety guidelines, parental consent, and data ownership for children using devices and screens for educational purposes, including online homework. Parents send their children to school expecting them to be safe. Many parents do not want their children required to use screens and technology for regular coursework and when learning core subjects. Many parents are uncomfortable with online personalized learning and would prefer face-to-face human teachers and textbooks as an option. The cost of attending public schools should not be mandatory screen exposure and loss of privacy. We hope that Common Sense will address these concerns in their work.

Project Unicorn is Promoting Interoperability. What is it?

An April 2017 Clayton Christensen Institute blog post, shared on the Project Unicorn news website, explains the path to data interoperability this way:

“The first path toward interoperability evolves when industry leaders meet to agree on standards for new technologies. With standards, software providers electively conform to a set of rules for cataloging and sharing data. The problem with this approach in the current education landscape is that software vendors don’t have incentives to conform to standards. Their goal is to optimize the content and usability of their own software and serve as a one-stop shop for student data, not to constrain their software architecture so that their data is more useful to third parties.

Until schools and teachers prioritize interoperability over other features in their software purchasing decisions, standards will continue to fall by the wayside with technology developers. Efforts led by the Ed-Fi Alliance, the Access for Learning Community, and the federal government’s Common Education Data Standards program, all aim to promote common sets of data standards. In parallel with their (sic) these efforts, promising initiatives like the Project Unicorn pledge encourage school systems to increase demand for interoperability.”  [Emphasis added] https://www.christenseninstitute.org/blog/making-student-data-usable-innovation-theory-tells-us-interoperability/

A one-stop shop for student data, flowing seamlessly for third parties: Interoperability. 
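Stripped of the marketing language, interoperability just means that every vendor describes the same facts about a child with the same field names, so records flow between systems without translation. The sketch below is a hypothetical illustration of that idea only; the field names are invented and are not actual CEDS or Ed-Fi elements.

    # Hypothetical illustration of interoperability: two vendors record the
    # same student activity under different field names; a shared standard
    # re-keys both onto one vocabulary so any third party can consume them.

    vendor_a_record = {"studentRef": "A-1041", "lexileScore": 720, "minutesOnTask": 34}
    vendor_b_record = {"sid": "A-1041", "reading_level": 720, "engagement_minutes": 34}

    # Mapping onto an invented "common standard" (not real CEDS element names).
    TO_COMMON = {
        "studentRef": "StudentIdentifier", "sid": "StudentIdentifier",
        "lexileScore": "ReadingScaleScore", "reading_level": "ReadingScaleScore",
        "minutesOnTask": "MinutesEngaged", "engagement_minutes": "MinutesEngaged",
    }

    def to_common(record):
        """Re-key a vendor record into the shared vocabulary."""
        return {TO_COMMON[key]: value for key, value in record.items()}

    # Once both records share one schema, they match and can be merged into
    # a single, ever-growing profile of the same child.
    assert to_common(vendor_a_record) == to_common(vendor_b_record)
    print(to_common(vendor_a_record))

That is the whole trick, and the whole concern: once the keys line up, records from every application concatenate into one profile, with or without anyone asking the student.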

How will  Project Unicorn help give students ownership of their data? Will students have consent and control over their data? We asked. 

Interestingly, up until a few days ago, Project Unicorn’s twitter profile stated that their focus is “shifting the ownership of data to schools and students.” See this screenshot from February 18, 2018 and a twitter conversation below.

Project Unicorn Tweet 2

Project Unicorn replied the following day but they did not immediately answer my question about student data consent and ownership. Instead, they listed a few of their partners: Data Quality Campaign, Future of Privacy, Common Sense Media, National PTA. Again, I asked them about their statement about shifting ownership of data to the student.

Project Unicorn Tweet 3

Project Unicorn Tweet 4

Gretchen Logue also replied to Project Unicorn and their partners, asking if students can NOT have their data shared. Two days later, she still had not received a reply.

Logue

I directly asked Project Unicorn’s partner, Digital Promise, to help answer whether students can consent to data collection. (Remember, DP is the edtech- and personalized-learning-promoting non-profit created by the US Department of Ed.) Digital Promise never responded to this parent’s questions. Maybe they just need a little more time, or maybe parents aren’t important enough to bother with?

Tweet 5

tweet 6

tweet 7

Project Unicorn replied: they changed their Twitter profile to better reflect the scope of their project. They no longer claim to shift data ownership to students. They are promoting data interoperability. To be clear: they are NOT giving students ownership of their data. See their new Twitter profile in this February 23, 2018 screenshot below.

Project Unicorn interoperability

Why do edtech companies and our government have such a problem giving students consent and true ownership of their data? Data is money. Data is identity.  Student data is NOT theirs to take. 

Without the student, the data does not exist. If a student writes an essay for a class assignment, that written work belongs to the student. If a student draws a picture in art class, that artwork is theirs. Parents (and the Fourth Amendment) would argue that personal information about a student, created by a student, should belong to the student.

#TruthinTech: Unicorns are taking student data and sharing it without consent. What say you @HumaneTech?

Humane tech

Tech is hacking kids’ brains, but it is also stealing their data: students’ every keystroke can be collected and analyzed, and student education records can be shared. (FERPA is a 40-year-old law that doesn’t cover data, meta data, or algorithms, and it was substantially weakened in 2011 to allow personally identifiable information to be shared outside of the school with nonprofits, researchers, or anyone approved as a school official or for an educational purpose – without parent consent or knowledge.) HumaneTech folks, are you good with this predictive profiling, leveraging, and capitalizing of children who are held hostage in this mandatory, surveilled school system? Schools are the new smart cities – except children are a captive audience and they are being exploited. They have no choice.

Why not do real, independent research, set guidelines and protect kids from screens in schools? Why not give parents and students a choice of tech vs paper, allow the option of learning knowledge vs in-school personality surveys and emotional assessments and biometric health trackers? Why not be transparent about algorithms and analytics and get consent BEFORE collecting and using student or teacher data?

GDPR.

Europe requires consent before collecting and sharing personal data, including for automated decision making. GDPR gives Europeans (including students) more control over how their data is handled, including breach notification and penalties, data redaction, and consent. Why would American students be any less deserving than students in Europe? GDPR will have global implications. Modernizing FERPA and COPPA to align with GDPR would be both practical and ethical. Why isn’t Project Unicorn also advocating for the GDPR standard of basic human privacy and data identity rights for American citizens and children?

A final note. Project Unicorn is not an elected, governing body, so why are they directing US education policy? Decisions should be made democratically, by those closest to the children, instead of by a few billionaires. What gives philanthro-funders the right to leverage children’s data and encourage schools with their procurement $trategies? The Edtech Billionaires directing education – experimenting on children – have created (and are profiting from) this data-driven problem: teachers are so busy collecting endless data points that they don’t have the time or the freedom to teach. Now the regretful tech industry wants to swoop in and make the data collection process easier, freeing up teachers (or replacing them?) with a single-sign-on, standardized data collection tool. Children are not a product to be leveraged. Please stop using schools and children as a permissionless innovation data supply.

IMS Global

And why oh why, Project Unicorn, are you working with IMS Global?  Uncommon Alliance indeed.

“…interoperability specification for educational click stream analytics created by the education community for the education community. Major educational suppliers are using Caliper to collect millions of events every week and the data is helping to shape teaching and learning on multiple levels. Several leading institutions are also working on putting Caliper in place. Now is a great time for both institutions and suppliers to begin putting learning analytics in place using Caliper.”

IMS Global Learning Consortium
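For readers unfamiliar with “click stream analytics”, an event of the kind Caliper standardizes is essentially a small timestamped record of who did what to which resource, emitted each time a student acts inside the software. The example below is only a rough, schematic approximation of that shape, with invented identifiers; consult the IMS Caliper specification for the real event model.

    # Schematic approximation of a single clickstream event. Field names and
    # identifiers are illustrative, not the actual IMS Caliper schema.

    event = {
        "type": "NavigationEvent",                     # what kind of action
        "actor": "urn:example:student:A-1041",         # who (a student identifier)
        "action": "NavigatedTo",                       # what they did
        "object": "urn:example:reader:chapter-3",      # what they did it to
        "eventTime": "2018-02-23T14:07:52.413Z",       # when, to the millisecond
        "extensions": {"timeOnPreviousPage_s": 11.2},  # whatever else the vendor wants
    }

    # "Millions of events every week" means records like this one, streamed
    # continuously from every student session to the analytics supplier.
    print(event["actor"], event["action"], event["object"])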

-Cheri Kiesecker

Dear Congress, you are being duped. HR4174-S2046 is a Privacy Fail. Here’s why. (And please, no more suspended rules and voice votes on these bills.)

Reposted with permission from  Missouri Education Watchdog.

not_for_sale

I will say it again… When it comes to their own children, parents have little to no say in education matters. Parents are not invited to fancy conferences, we often aren’t even allowed to attend them. Parents don’t have a travel budget, a lobby budget, or a paid assistant to help write rebuttals and policy briefs. Nope, we are moms and dads and grandparents doing the best we can to protect our children. And that is why I am responding to the federal government’s response to my blogpost opposing their bill(s) HR4174 and S2046, Foundations for Evidence-Based Policymaking Act of 2017.

Dear  Congress,

The GOP Majority Staff of the Congressional House Committee on Oversight and Government Reform wrote and distributed a response to my November 12 blogpost that opposed HR4174. This response, which folks can see here, begins with:

“The Eagle Forum and other groups representing interests such as home schooling have raised concerns about H.R. 4174, the Foundations for Evidence-Based Policymaking Act of 2017. The concerns relate to how the bill would affect the privacy of citizens (especially school-aged children) whose data is being stored by the federal government. Those concerns arise from a misunderstanding of what the bill does to the personal data that the government already has.”

Let me clear something up.  I am not a member of Eagle Forum nor am I a member of a home school group, not that I have anything against them; I just don’t want them to be responsible for what I say.  Missouri Education Watchdog lets me write on their blog but my views are my own. I am a mom. My special interests are my children. I write as a parent, because like many parent advocates, blogging is the only (small) way to be heard.

And No.

My concern DOES NOT “arise from a misunderstanding of what the bill does to the personal data that the government already has.”  You have it sort of right;  let me restate it:

MY CONCERN IS THAT THE GOVERNMENT HAS CITIZENS’ AND ESPECIALLY SCHOOL-AGED CHILDREN’S PERSONAL DATA, WITHOUT PERMISSION…AND IS EXPANDING ACCESS, ANALYSIS OF THIS DATA, AGAIN WITHOUT PERMISSION.

It’s not your data. Data belongs to the individual. Data is identity and data is currency. Collecting someone’s personal data without consent is theft. (When hackers took Equifax data, that was illegal. When the government takes data… no different.)

If you support parental rights, you should not support HR4174 or its sister bill S2046.  Parents are often left out of the conversation about laws affecting their children.

I will say it again… When it comes to their own children, parents have little to no say in education matters. Parents are not invited to fancy conferences, we often aren’t even allowed to attend them. Parents don’t have a travel budget, a lobby budget, or a paid assistant to help write rebuttals and policy briefs. Nope, we are moms and dads and grandparents doing the best we can to protect our children. And that is why I am responding to the federal government’s response to my blogpost opposing their bill(s) HR4174 and S2046, Foundations for Evidence-Based Policymaking Act of 2017.

I invite members of Congress and policy makers: rather than refute or ignore us, please have a discussion with those closest to the children: parents.

You impose legislation that directly impacts our children and our families, without our input. We elected you to represent us, “we the people”.    Please hear us, the parents. These are our children, not your human capital, not your data, not your property.

What follows are sections on:

  1. Brief status of student data collection
  2. History and mission of CEP Commission, current linking of IRS data, Census Data, Education data.
  3. China, the US, tech companies and collection, analysis of citizens’ data, dangers of algorithms, metadata profiling.
  4. Status of HR4174, voice votes and suspended rules (why this controversial bill should have had neither)
  5. FACTS. Links to bill text, refuting the House Oversight rebuttal.
  6. Here is a two pager citing only facts, bill text.   http://tinyurl.com/HR4174twopage

The current state of student data collection – you need to know this.

Bill Gates, who has spent billions on reforming education and on creating and sharing standardized data and state databases, also wants a national student database linking K-12 and higher ed data. According to the Gates Foundation’s 2016 Priorities, this is the national database infrastructure he has in mind. Coincidence?

Gates data infrastructure

State agencies currently maintain personally identifiable data about citizens, including K-12 school children. My focus is on student data because student data are collected, shared, and analyzed without parent consent. Parents have a right to direct our children’s education, and citizens have a right to be secure in their property. …or do we? Taking personal information about a child and sharing it without the parents’ knowledge or consent is (SHOCKINGLY) legal, thanks to a 2011 executive rule change that weakened FERPA.

Any Congressperson who would like to spend his or her Thanksgiving dinner explaining to friends and relatives why taking personal information about a child and sharing it without parent consent is ethical or principled, please go ahead. Also, let them know that you passed a bill giving more access to this ill-gotten personal information of students. Be my guest.

As for me, I find HR4174’s collection and sharing of a school child’s personal data without parent consent unconstitutional, unethical, and a violation of children’s privacy and parental rights.

The Electronic Privacy Information Center (EPIC) also challenged the nonconsensual sharing of students’ personal information and the weakening of FERPA. See the EPIC lawsuit against the US Department of Education here.

Very personal information about K-12 students (i.e., personal background info on kindergarten-12 registration forms, demographics, race, health records, disability status, income status, a multitude of invasive surveys, even personality tests, etc.) is currently collected at all public K-12 schools and can be shared outside of the school without the parents’ knowledge. Many have said for years that student data collection is out of control and that we are not protecting children: Asleep at the Switch: Schoolhouse Commercialism, Student Privacy, and the Failure of Policymaking.

Meta data and mouse-clicks are used to predict a child and measure their behavior. Amazon and Facebook and Google and Microsoft and many other edtech companies are invading the classroom. Edtech companies like DreamBox, Khan Academy, and Knewton use adaptive or “personalized” online programs that collect large amounts of data on each child. Knewton claims 5-10 million data points per child, per day. DreamBox claims 50,000 data points per hour on each student. These “personalized” software programs embedded in education technology collect data about a student, secretly determining which questions students will see, measuring how fast a child reads, what he or she clicks on, and how long he or she takes to answer a question. This meta data is sometimes used to measure a child’s “social emotional learning” and engagement. One assessment company, NWEA, measures test item response times and says that if a child responds to a test question too quickly, he or she will get a low engagement score. NWEA thinks a child’s rapid response means the child is guessing, and that this disengagement can be tied to other “deep rooted problems” in a student’s life, such as:

“a student’s likelihood of disengaging on a test was associated with his or her self-management and self-regulation skills, the ability, for example, to show up for class prepared and on time. “As they disengage from tests and the course material, a whole host of other things come up … attendance, suspensions, course failure … that have been connected to risk of dropping out of school,”
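As a concrete illustration of the response-time logic described above, here is a toy version of rapid-guess flagging. The three-second cutoff, the data, and the scoring are invented for illustration; this is not NWEA’s actual method.

    # Toy illustration of response-time "engagement" scoring: answers faster
    # than a cutoff are treated as rapid guesses, and the share of rapid
    # guesses becomes the child's "disengagement" number. The threshold and
    # the data are hypothetical, not any vendor's real parameters.

    RAPID_GUESS_SECONDS = 3.0

    item_response_times = [14.2, 2.1, 9.8, 1.7, 22.5, 2.4]  # seconds per test item

    rapid_guesses = [t for t in item_response_times if t < RAPID_GUESS_SECONDS]
    disengagement_rate = len(rapid_guesses) / len(item_response_times)

    print(f"flagged {len(rapid_guesses)} of {len(item_response_times)} items "
          f"as rapid guesses ({disengagement_rate:.0%} 'disengaged')")

Notice how little the number actually says: a fast correct answer and a careless guess look identical to the timer, yet the resulting label is treated as a window into “deep rooted problems” in the child’s life.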

In a digital environment, everything a child does online can be captured, connected, and catalogued. The LearnSphere project, funded by the National Science Foundation and handled by Carnegie Mellon, began in 2014 and is explained as follows:

“There are several important initiatives designed to address these data access challenges, for individual researchers as well as institutions and states. LearnSphere, a cross-institutional community infrastructure project, aims to develop a large-scale open repository of rich education data by integrating data from its four components.[17] For instance, DataShop stores data from student interactions with online course materials, intelligent tutoring systems, virtual labs, and simulations, and DataStage stores data derived from online courses offered by Stanford UniversityClick-stream data stored in these repositories include thousands and even millions of data points per student, much of which is made publicly available to registered users who meet data privacy assurance criteria. On the other hand, MOOCdb and DiscourseDB, also components of LearnSphere, offer platforms for the extraction and representation of student MOOC data and textual data, respectively, surrounding student online learning interactions that are otherwise difficult to access or are highly fragmented. By integrating data held or processed through these different components, LearnSphere will create a large set of interconnected data that reflects most of a student’s experience in online learning.” http://www.sr.ithaka.org/publications/student-data-in-the-digital-era/

Shouldn’t parents be able to see and consent to this information being collected and analyzed about their children? Will researchers and edtech companies be granted MORE access to the personal student data held by the DataShop through the access HR4174 creates? (Yes, according to the bill excerpts below.)

Personal information about a student is already shared with a state longitudinal data system (SLDS). See here for what data elements are stored in the state data dictionary. The states share this personal student data (personally identifiable information, pii) with other agencies, corporations, and researchers – again without parent notification or consent, and parents cannot opt out. See here for an example of state agreements to share student pii with companies, researchers, agencies, etc.

The Department of Defense also has access to student data through the Federal Learning Registry, a joint student data gathering project between the Department of Defense and the Department of Education. The Learning Registry and the US Department of Education are also “encouraging districts and states to move away from traditional textbooks” and instead use the Learning Registry’s openly-licensed online materials (Open Educational Resources, OERs), facilitated by Amazon, Microsoft, Edmodo, ASCD, and Creative Commons. Can parents see this data or opt out? Nope.

The safest way to protect data is to minimize its collection. HR4174 does not minimize data collection, nor does it decrease disclosures. Schools and student databases across the country are currently being hacked and held for ransom, and students are being threatened by cyber terrorists. With the federal government’s track record of failing FITARA security scores and recent data breaches, the thought of the federal government coordinating and maintaining expanded access to state-level student data is concerning.

History and mission of CEP Commission

HR4174 is a result of the CEP (Commission on Evidence-Based Policymaking); as stated in the bill and in the CEP final report, its purpose is identifying and reducing or removing barriers to accessing state-level data. The CEP commission held several meetings and three public hearings. I suggest you review the minutes, video, and audio of these meetings and hearings. You can read about the history of the CEP commission, watch the first public hearing, and see the written testimony submitted here.

The testimony from the Oct 21, 2016 CEP hearing panelists is enlightening:

For example, RK Paleru of Booz Allen Hamilton said in his testimony that BAH supports, among other things, linking student data from surveys and multiple agencies, public-private partnerships, and data analytics, and “bringing the private sector perspective to the conversation.” He also stated the need for a data clearinghouse to be self-service and like a “Pinterest for data”, or data as a paid service, and wanted to promote inter-agency data sharing.

Another Oct 21 CEP hearing panelist, Rachel Zinn of the Workforce Data Quality Campaign (WDQC), said that because of the current ban on a federal student database, “stakeholders” don’t have access to student information. She goes on to say that in order to link and share data, stakeholders often have to use “non-standard processes, often goes through personal relationships or particular capacities within agencies at particular times”.

Panelists at the Feb 9, 2017 CEP hearing (listen to the audio from the 57 min to 1 hr 14 min mark):

Panelists discussed making it easier to link personally identifiable information from IRS records, personal information from the Census population survey, and personal information from education records and SLDS. With the CEP Commission making this personal data more accessible and more available, one researcher says he feels “like a kid in candy store“. There are great barriers that prevent researchers from getting this data; currently researchers have to get it by “hook or crook” or “by leveraging personal relationships”. The CEP questioned the coercive nature of obtaining this data. At 1 hour 11 minutes, panelists discuss how they can currently link Census population survey data and personal IRS data: “with persistence any academic researcher can access these data, you just have to know the steps to get there and I think that’s the Commission’s charge“…

The Feb 24, 2017 CEP meeting:

Again, panelists discuss how they are already linking personally identifiable state-level education records with IRS records, but say it is difficult and that barriers need to be removed to make it easier to link this pii data between agencies.
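To show what "linking this pii data between agencies" can mean in practice, here is a minimal sketch on made-up records using pandas. The column names and matching rule are illustrative assumptions, not any agency's actual procedure.

```python
# A minimal sketch of cross-agency record linkage on made-up data.
# Column names and the matching rule are illustrative assumptions only.
import pandas as pd

education_records = pd.DataFrame([
    {"name": "Jane Doe", "dob": "2005-03-14", "school": "Lincoln MS", "reading_score": 210},
    {"name": "John Roe", "dob": "2004-11-02", "school": "Lincoln MS", "reading_score": 188},
])

tax_records = pd.DataFrame([
    {"name": "Jane Doe", "dob": "2005-03-14", "household_income": 41000},
    {"name": "John Roe", "dob": "2004-11-02", "household_income": 87000},
])

# Joining on name + date of birth links an education record to an IRS-style record,
# producing a combined profile that neither agency held on its own.
linked = education_records.merge(tax_records, on=["name", "dob"], how="inner")
print(linked)
```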

IRS and student data.jpg

CHINA and US: Meta data, predictive algorithms, analyzing and generating data, social engineering

Linking all this personal data on citizens reminds me of why I mentioned that China collects and links data about its citizens. Is there anything in HR4174 that says personal data cannot be used to rank a person, create a reputation score, or profile a person? HR4174 allows metadata analysis and the generation of new data that can be used to predict and profile. Algorithms can be biased and wrong. HOW can you possibly police this? A good start would be Europe’s General Data Protection Regulation.
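HR4174 does not define any scoring system, so the following is purely a hypothetical sketch of how easily linked personal data could be collapsed into a "reputation"-style score. Every field and weight below is invented for illustration; the point is how little code such profiling requires once the data is linked.

```python
# Purely hypothetical: how linked attributes could be collapsed into a single
# "reputation" score. Nothing in HR4174 defines such a score; the fields and
# weights below are invented to show how little code such profiling requires.
def reputation_score(profile: dict) -> float:
    weights = {
        "attendance_rate": 40.0,        # rewards compliance
        "discipline_incidents": -15.0,  # penalizes flagged behavior
        "household_income_decile": 5.0,
        "test_score_percentile": 0.4,
    }
    return sum(weights[k] * profile.get(k, 0) for k in weights)

student = {
    "attendance_rate": 0.93,
    "discipline_incidents": 2,
    "household_income_decile": 3,
    "test_score_percentile": 61,
}
print(round(reputation_score(student), 1))  # one opaque number now "represents" a child
```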

Tech companies in the US are ramping up their use of predictive analytics and artificial intelligence, despite dire warnings of existential risk. This article on Twitter, Facebook and Google analytics is a warning on why we should be concerned. Do Facebook and Google even have control of their algorithms anymore? A sobering assessment and a warning:

“Google, Twitter, and Facebook have all regularly shifted the blame to algorithms when this happens, but the issue is that said companies write the algorithms, making them responsible for what they churn out.

“Algorithms can be gamed, algorithms can be trained on biased information, and algorithms can shield platforms [tech companies] from blame.”

YET, have you ever heard of Yet Analytics? To quote this article, Yet, HP and the Future of Human Capital Analytics: AI and your reputation score:

“querying of big data comprising information on learning, economic and social factors and outcomes gathered by the World Bank, the World Economic Forum, the United Nations and elsewhere. The outcome is the ability to predict multi-year return on investment on a great variety of learning, economic and social measures. We knew that variables including adolescent fertility rates, infant mortality rates and the balance of trade goods all had significant relationships with GDP per capita.”
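As an illustration of the kind of cross-indicator analysis described above, here is a minimal sketch of a regression relating adolescent fertility, infant mortality, and trade balance to GDP per capita. The numbers are fabricated placeholders, not World Bank data, and this is not Yet Analytics' actual model.

```python
# A sketch of the kind of cross-indicator regression described above.
# The numbers are fabricated placeholders, not World Bank data.
import numpy as np
from sklearn.linear_model import LinearRegression

# Columns: adolescent fertility rate, infant mortality rate, trade balance (% of GDP)
X = np.array([
    [12.0,  3.5,  2.1],
    [45.0, 18.0, -4.0],
    [80.0, 40.0, -9.5],
    [20.0,  6.0,  1.0],
    [60.0, 25.0, -6.2],
])
y = np.array([48000, 15000, 4000, 38000, 9000])  # GDP per capita (USD)

model = LinearRegression().fit(X, y)
print(model.coef_)                          # direction and size of each relationship
print(model.predict([[30.0, 10.0, -1.0]]))  # "prediction" for a hypothetical country
```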

Microsoft, of course, uses artificial intelligence and analytics with its Cortana technology, but it also has Project MALMO, built on the MINECRAFT platform: “How can we develop artificial intelligence that learns to make sense of complex environments? That learns from others, including humans, how to interact with the world? Project Malmo sets out to address these core research challenges, addressing them by integrating (deep) reinforcement learning, cognitive science, and many ideas from artificial intelligence.” Microsoft also has PROJECT BRAINWAVE for real-time artificial intelligence processing.

Facebook and your credit score? Facebook reportedly has a patent for technology that could potentially be used to evaluate your credit risk by looking at your social network connections and using them to determine your creditworthiness.
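Press reports on the patent described judging an applicant by the average credit rating of their connections. The toy sketch below, on invented data, only illustrates that reported idea; it is not Facebook's method or code.

```python
# Toy illustration of a social-graph credit signal on invented data.
# This is NOT Facebook's method or code; it only shows the general idea
# reported in coverage of the patent: judge a borrower by their connections.
friend_credit_scores = {"ana": 710, "ben": 640, "cara": 580, "dev": 690}

def average_connection_score(connections: list) -> float:
    scores = [friend_credit_scores[c] for c in connections if c in friend_credit_scores]
    return sum(scores) / len(scores) if scores else 0.0

applicant_connections = ["ana", "cara", "dev"]
avg = average_connection_score(applicant_connections)
print(avg, "approve" if avg >= 650 else "deny")  # the applicant never chose this proxy
```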

Status of HR4174

HR4174 was introduced on 10/31/2017 and passed on a voice vote in the House Oversight and Government Reform Committee. Yesterday, the US House of Representatives suspended its rules, something that, according to this document, is only done for non-controversial bills. Judging by the public outcry and the rebuttal response from House Oversight, I would argue this bill is controversial and should not have been voted on under suspended rules. With rules suspended and another voice vote, the House unanimously passed HR4174 on 11/15/2017. Watch the vote, starting at the 4 hr 52 min mark, here.

Myth or Fact?  You decide.

myth or fact

The rebuttal

FACT: Parents cannot opt students out of this state data collection, which happens without consent.

HR4174 will increase access to this state-level student data, allowing data to be linked with or disclosed to government agencies and researchers, again without consent.

  • If HR4174 does allow parental consent and does allow parents to opt out of student data collection and sharing, please correct me. It would be imperative to specifically state parental consent and opt-out rights in the bill, so schools and parents are aware of this provision. There’s still time to add this opt-out provision in the Senate.

FACT: HR4174 removes barriers to state-level data access and creates a National Secure Data Service (NSDS), with a Chief Evaluation Officer in each federal department; the NSDS will be coordinated through the Office of Management and Budget (OMB). Data officers in each agency oversee the dissemination and generation of data between state agencies and private users, contractors, and researchers, while finding new and innovative ways to use technology to improve data collection and use.

Does that sound like a national system to manage and disclose data? …Keep reading.

  • § 3520A. Chief Data Officer Council

“(a) Establishment.—There is established in the Office of Management and Budget a Chief Data Officer Council (in this section referred to as the ‘Council’).

“(b) Purpose and functions.—The Council shall—

“(1) establish Governmentwide best practices for the use, protection, dissemination, and generation of data;

“(2) promote and encourage data sharing agreements between agencies;

“(3) identify ways in which agencies can improve upon the production of evidence for use in policymaking;

“(4) consult with the public and engage with private users of Government data and other stakeholders on how to improve access to data assets of the Federal Government; and

“(5) identify and evaluate new technology solutions for improving the collection and use of data.

FACT: HR4174 requires each agency (see the list of 17 agencies, A-Q below, that will maintain and disclose data) to make any data asset maintained by the agency available to any statistical agency. The head of each agency shall …make a list of data the agency intends to collect, use, or acquire. This data may be in an identifiable form and may include operating and financial data and information about businesses, tax-exempt organizations, and government entities.

  • HR4174 PART D—ACCESS TO DATA FOR EVIDENCE

    § 3581. Presumption of accessibility for statistical agencies and units

    “(a) Accessibility of data assets.—The head of an agency shall, to the extent practicable, make any data asset maintained by the agency available, upon request, to any statistical agency or unit for purposes of developing evidence.

  • § 312. Agency evidence-building plan

    “(a) Requirement.—Not later than the first Monday in February of each year, the head of each agency shall submit to the Director and Congress a systematic plan for identifying and addressing policy questions relevant to the programs, policies, and regulations of the agency. Such plan shall be made available on the public website of the agency and shall cover at least a 4-year period beginning with the first fiscal year following the fiscal year in which the plan is submitted and published and contain the following:

    “(1) A list of policy-relevant questions for which the agency intends to develop evidence to support policymaking.

    “(2) A list of data the agency intends to collect, use, or acquire to facilitate the use of evidence in policymaking.

    “(3) A list of methods and analytical approaches that may be used to develop evidence to support policymaking.

    “(4) A list of any challenges to developing evidence to support policymaking, including any statutory or other restrictions to accessing relevant data.

Agencies involved in the HR4174 Federal evidence-building activities.

HR4174 “SUBCHAPTER II—FEDERAL EVIDENCE-BUILDING ACTIVITIES

§ 311. Definitions

“(1) AGENCY.—The term ‘agency’ means an agency referred to under section 901(b) of title 31.

901(b) of title 31 :
(b)
(1) The agencies referred to in subsection (a)(1) are the following:
(A) The Department of Agriculture.
(B) The Department of Commerce.
(C) The Department of Defense.
(D) The Department of Education.
(E) The Department of Energy.
(F) The Department of Health and Human Services.
(G) The Department of Homeland Security.
(H) The Department of Housing and Urban Development.
(I) The Department of the Interior.
(J) The Department of Justice.
(K) The Department of Labor.
(L) The Department of State.
(M) The Department of Transportation.
(N) The Department of the Treasury.
(O) The Department of Veterans Affairs.
(P) The Environmental Protection Agency.
(Q) The National Aeronautics and Space Administration.

https://www.law.cornell.edu/uscode/text/31/901

FACT: Data is shared between designated statistical agencies and can be personally identifiable data. Agencies and the Director can promulgate their own rules about data disclosure and sharing. The overseers of data dissemination and generation can make their own rules.

  • “(c) Sharing of business data among Designated Statistical Agencies.—

    “(1) IN GENERAL.—A Designated Statistical Agency may provide business data in an identifiable form to another Designated Statistical Agency under the terms of a written agreement among the agencies sharing the business data that specifies—

    “(A) the business data to be shared;

    “(B) the statistical purposes for which the business data are to be used;

    “(C) the officers, employees, and agents authorized to examine the business data to be shared; and

    “(D) appropriate security procedures to safeguard the confidentiality of the business data.

 

  • “(e) Designated Statistical Agency defined.—In this section, the term ‘Designated Statistical Agency’ means each of the following:

    (1) The Census Bureau of the Department of Commerce.

    (2) The Bureau of Economic Analysis of the Department of Commerce.

    (3) The Bureau of Labor Statistics of the Department of Labor.”.

  • “(3) BUSINESS DATA.—The term ‘business data’ means operating and financial data and information about businesses, tax-exempt organizations, and government entities.  [Note: Schools are tax-exempt and government entities.]

 

  • “§ 3562. Coordination and oversight of policies

    “(a) In general.—The Director shall coordinate and oversee the confidentiality and disclosure policies established by this subchapter. The Director may promulgate rules or provide other guidance to ensure consistent interpretation of this subchapter by the affected agencies. The Director shall develop a process by which the Director designates agencies or organizational units as statistical agencies and units. The Director shall promulgate guidance to implement such process, which shall include specific criteria for such designation and methods by which the Director will ensure transparency in the process.

    “(b) Agency rules.—Subject to subsection (c), agencies may promulgate rules to implement this subchapter. Rules governing disclosures of information that are authorized by this subchapter shall be promulgated by the agency that originally collected the information.

FACT: Data is linked between agencies.

  • § 316. Advisory Committee on Data for Evidence Building

    During the first year of the Advisory Committee, the Advisory Committee shall—

    “(B) evaluate and provide recommendations to the Director on the establishment of a shared service to facilitate data sharing, enable data linkage, and develop privacy enhancing techniques,

FACT: Data may be shared with private organizations, researchers, consultants, contractors, employees of contractors, government entities, and individuals who agree in writing to comply with the provisions.

  • “(e) Designation of agents.—A statistical agency or unit may designate agents, by contract or by entering into a special agreement containing the provisions required under section 3561(2) for treatment as an agent under that section, who may perform exclusively statistical activities, subject to the limitations and penalties described in this subchapter.

 

  • “(2) AGENT.—The term ‘agent’ means an individual

    “(A)(i) who is an employee of a private organization or a researcher affiliated with an institution of higher learning (including a person granted special sworn status by the Bureau of the Census under section 23(c) of title 13), and with whom a contract or other agreement is executed, on a temporary basis, by an executive agency to perform exclusively statistical activities under the control and supervision of an officer or employee of that agency;

    “(ii) who is working under the authority of a government entity with which a contract or other agreement is executed by an executive agency to perform exclusively statistical activities under the control of an officer or employee of that agency;

    “(iii) who is a self-employed researcher, a consultant, a contractor, or an employee of a contractor, and with whom a contract or other agreement is executed by an executive agency to perform a statistical activity under the control of an officer or employee of that agency; or

    “(iv) who is a contractor or an employee of a contractor, and who is engaged by the agency to design or maintain the systems for handling or storage of data received under this subchapter; and

    “(B) who agrees in writing to comply with all provisions of law that affect information acquired by that agency.

  • SEC. 202. OPEN Government Data.

    (a) Definitions.—Section 3502 of title 44, United States Code, is amended—

    “(15) the term ‘data’ means recorded information, regardless of form or the media on which the data is recorded;

    “(16) the term ‘data asset’ means a collection of data elements or data sets that may be grouped together;

    “(17) the term ‘machine-readable’, when used with respect to data, means data in a format that can be easily processed by a computer without human intervention while ensuring no semantic meaning is lost;

    “(18) the term ‘metadata’ means structural or descriptive information about data such as content, format, source, rights, accuracy, provenance, frequency, periodicity, granularity, publisher or responsible party, contact information, method of collection, and other descriptions;
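To ground definitions (15)–(18), here is a small hypothetical example of a machine-readable metadata record for a data asset. The fields simply mirror the elements listed in paragraph (18); this is not an official federal schema.

```python
# Hypothetical metadata record for a data asset, mirroring the elements
# listed in paragraph (18). Field names are illustrative, not an official schema.
import json

metadata = {
    "title": "Statewide K-12 assessment results",
    "format": "CSV",
    "source": "State education agency SLDS",
    "rights": "Restricted - contains identifiable records",
    "provenance": "Extracted from district student information systems",
    "frequency": "Annual",
    "granularity": "One row per student per assessment",
    "publisher": "State department of education",
    "method_of_collection": "Mandatory district reporting",
}

print(json.dumps(metadata, indent=2))  # machine-readable: parseable without a human
```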

FACT: You are correct that HR4174 does repeal the E-Government Act of 2002 (Public Law 107–347; 44 U.S.C. 3501) and re-insert it in title 44. However, the CIPSEA penalty of a $250,000 fine or 5 years in prison is not new; it has been in place since 2002. Student data has been collected and shared without consent since 2012; CIPSEA was either not applicable or not enforced. Ironically, HR4174 weakens CIPSEA.

CIPSEA is amended to expand access to data. Additionally, once again, the Director can promulgate regulations on what data to share.

  • § 3582. Expanding secure access to CIPSEA data assets

“(a) Statistical agency responsibilities.—To the extent practicable, each statistical agency or unit shall expand access to data assets of such agency or unit acquired or accessed under this subchapter to develop evidence while protecting such assets from inappropriate access and use, in accordance with the regulations promulgated under subsection (b).

“(b) Regulations for accessibility of nonpublic data assets.—The Director shall promulgate regulations, in accordance with applicable law, for statistical agencies and units to carry out the requirement under subsection (a). Such regulations shall include the following:

“(1) Standards for each statistical agency or unit to assess each data asset owned or accessed by the statistical agency or unit for purposes of categorizing the sensitivity level of each such asset and identifying the corresponding level of accessibility to each such asset. Such standards shall include—

“(A) common sensitivity levels and corresponding levels of accessibility that may be assigned to a data asset, including a requisite minimum and maximum number of sensitivity levels for each statistical agency or unit to use;

“(B) criteria for determining the sensitivity level and corresponding level of accessibility of each data asset; and

“(C) criteria for determining whether a less sensitive and more accessible version of a data asset can be produced.

“(2) Standards for each statistical agency or unit to improve access to a data asset pursuant to paragraph (1) or (3) by removing or obscuring information in such a manner that the identity of the data subject is less likely to be reasonably inferred by either direct or indirect means.

“(3) A requirement for each statistical agency or unit to conduct a comprehensive risk assessment of any data asset acquired or accessed under this subchapter prior to any public release of such asset, including standards for such comprehensive risk assessment and criteria for making a determination of whether to release the data.
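Paragraph (2) above is describing de-identification. As a rough illustration of what "removing or obscuring information" looks like, and of its limits, here is a minimal sketch that coarsens quasi-identifiers in made-up records; real statistical agencies use far more rigorous disclosure-avoidance methods than this.

```python
# A rough sketch of the "removing or obscuring" step described in paragraph (2):
# generalize quasi-identifiers so individuals are harder to single out.
# Made-up records; real disclosure-avoidance methods are far more rigorous.
records = [
    {"zip": "98115", "birth_year": 2006, "school": "Lincoln MS", "score": 212},
    {"zip": "98117", "birth_year": 2005, "school": "Lincoln MS", "score": 190},
]

def deidentify(row: dict) -> dict:
    cohort_start = row["birth_year"] // 5 * 5
    return {
        "zip": row["zip"][:3] + "**",                       # coarsen geography
        "birth_cohort": f"{cohort_start}-{cohort_start + 4}",
        "school": "suppressed",                              # drop a direct quasi-identifier
        "score": row["score"],
    }

for row in records:
    print(deidentify(row))
# Caveat: coarsened fields can often still be re-identified once linked with
# other data sets, which is exactly the concern raised in this post.
```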

Continually saying that you aren’t collecting new data is meaningless – because the data was illegally obtained in the first place. HR4174 allows personal data to be shared without consent and, importantly, allows generated data and metadata analysis of citizens without consent. Personal data belongs to the individual. Data collection without consent is theft. It’s time the US updated its privacy laws – not weakened them further. Instead, it’s time for Congress to be a leader: minimize the data collected, protect privacy and security, and look to Europe’s General Data Protection Regulation, the strictest privacy law in the world.

-Cheri Kiesecker

Living the Gig Life: Tom Vander Ark’s Plans for My 6th Grader

live the gig life_2

That my kid was a potential stepping stone to the introduction of the entrepreneurial spirit – so valued by those pushing the gig economy – into our public school, and also a source of financial gain to boot, was sobering.

It also raises big questions:

Where are the local protections to keep kids from being exploited?

Who owns their work and other personal data?

With the push for badges, internships, and other in-school workforce training, what happens to the child labor laws that made public education possible for so many kids?

For the first time ever, I got to attend a Network for Public Education Conference and participate on a panel.

During my presentation, one slide caught the attention of a reporter for EdSurge. The panel was called Parent Hopes and the Gig Economy.

Here’s the slide:

Future of Work K-12

This is what the reporter had to say:

In the session, Leith called out influencers such as Tom Vander Ark, a former education director with the Bill & Melinda Gates Foundation and one of the webinar’s presenters, saying he’s “now planning out” what her “sixth grader might be doing in the future for work.”

According to the talk, that future may be moving towards the gig economy, which the Bureau of Labor Statistics refers to as “a single project or task for which a worker is hired, often through a digital marketplace, to work on demand.” Leith offered another way of describing it: a “series of little jobs” people in the future will have to work “enough to make ends meet.” Leith also claims the workers themselves don’t handle the money, but rather, a platform like Uber—”the middleman”—does.

That concerns Leith. She and others think the gig economy will work for some people, but not all. For instance, research done in 2016 from data and consulting firm Hearts & Wallets suggested that gig economy workers in their 40s, 50s and early 60s within a certain subgroup had high satisfaction rates working in the gig economy. But critics have argued that the gig economy model doesn’t protect workers from exploitation.

Leith is also worried about the way advocates are advertising the gig economy to young students. She feels there’s a false “positive sell” that uses language such as “you’re gonna choose your gigs, and you’re a creative of the new economy.” A more accurate description, she said, would be to say that the gig economy relies on “low paying jobs” that won’t make it possible to “buy a house.”

I think my comments were interesting because the ed-tech “thought leaders” don’t spend much time with the parents of their designated end users.

Parents who aren’t thrilled their students are beta-testing products for free or their public schools are being transformed into workforce development pipelines.

That someone would dare to criticize Vander Ark was news in itself.

More Ed-Tech Adventures with my 6th Grader

Back in 5th grade, I refused to sign the release for my kid’s work to be uploaded to Seesaw. Seesaw compiles digital portfolios of student work and was started by two former employees of Facebook.

As you can imagine, taking such a hard anti-technology stance in the land of Bill Gates put me on the short list for President of the local Tinfoil Hat Society.

A year later, the New York Times broke a story explaining how Seesaw is essentially bribing teachers to do product placement in their classrooms, and even worse, teachers are branding themselves and welcoming this commercialization of their profession.

Ms. Delzer also has a second calling. She is a schoolteacher with her own brand, Top Dog Teaching. Education start-ups like Seesaw give her their premium classroom technology as well as swag like T-shirts or freebies for the teachers who attend her workshops. She agrees to use their products in her classroom and give the companies feedback. And she recommends their wares to thousands of teachers who follow her on social media.

“I will embed it in my brand every day,” Ms. Delzer said of Seesaw. “I get to make it better.”

That my kid was a potential stepping stone to the introduction of the entrepreneurial spirit – so valued by those pushing the gig economy – into our public school, and also a potential source of financial gain to boot, was sobering.

It also raises big questions:

Where are the local protections to keep kids from being exploited?

Who owns their work and other personal data?

With the push for badges, internships, and other in-school workforce training, what happens to the child labor laws that made public education possible for so many kids?

Back to The Future of Work and What it Means for K-12 Schools Webinar

During the webinar, Michael Chui, Partner, McKinsey Global Institute, dropped this truth bomb.

In the future, will there be enough work? Historically, we have had enough work despite technology entering. I think in terms of actual amount of demand for labor, it’ll be there. But I do think there will be potential challenges in transitioning people from what they’re doing now as machines do some of what people do now into the new jobs of the future. Some of those things do have to do with retraining, re-skilling, even as people are in the workforce. And I think there’s a lot of work to be done in doing that successfully of scale. And I think that’s a challenge. And you also made reference to the fact that geographically in the United States, labor is at multi-decades low. Even that rate has declined over time. And if we’re going to have the outcomes, often times new jobs won’t be created in the same places where other jobs, you know, might be declining in employment. We’ll need to solve that challenge, as well, in terms of labor mobility. It’s been one of the things that has been, you know, underpinned, you know, good outcomes in the past in the economy and when we do — and then, one of the other challenges that we do see going forward, there’ll be enough, potentially enough work, we’ll need to transition people. And we also have a question as to whether or not the work will pay. Some of the modeling shows that, in fact, income polarization or inequality, whatever you want to call it. And Tom made reference to this before, a hollowing out of middle-aged jobs potentially could be exacerbated by technology, as well. 

Even enthusiastic supporters of the gig economy are worried about its negative impact on the number of jobs, living wages, and the employment of individuals in their 40s, 50s, and early 60s.

Here’s the giant red flag, which Chui awkwardly admits: some of the modeling of the gig economy shows more income polarization and increased inequality.

Of course, common sense points to the very same thing, but it’s interesting the “thought leaders” are worried about it too.

This should scare everyone who wants their kids and grandchildren to have the opportunity to live in a fair and stable society.

-Carolyn Leith


USDoE’s Digital Promise and Facebook Team Up for Student Data Badges while the Gates Funded Data Quality Campaign is Lobbying Congress to Weaken FERPA, Again.

Original Title: USDoE’s Digital Promise and Facebook team up for student databadges. And Gates funded DQC group is lobbying Congress to weaken FERPA, again. Reposted with permission from  Missouri Education Watchdog.

Facebook Getting Smart

Now, onto the mega announcement made today on Tom Vander Ark’s Getting Smart blog: Digital Promise is working with Facebook to develop student data badges. We have written about student micro-credentials (also called data badges) here and here, and NEPC wrote about them here. As for Digital Promise, we wrote about how Digital Promise is a nonprofit created by the US Department of Ed; it has a global arm and promotes Schools of Innovation, competency based ed, data badges, and Relay Grad School, to name a few. So, this new announcement shouldn’t be a surprise; it will no doubt be a wonderful data collection and marketing tool for Facebook and the US Department of Ed, but it is incredibly alarming for students’ privacy and security.

Trick or treat Two-fer today.

The Data Quality Campaign, funded by Bill Gates, is lobbying Congress to further weaken FERPA. You can and SHOULD read all about that here. We urge you to call or email your Congressman and Reps. Todd Rokita (IN) and Paul Mitchell (MI) to tell them NO. Stop sharing students’ personal data with researchers and marketers, corporations and “nonprofits” without parental consent. We need to fix FERPA and strengthen student data protection and privacy, not further weaken it. Please do take the time to read this and send an email. Thanks.

Now, onto the mega announcement made today on Tom Vander Ark’s Getting Smart blog: Digital Promise is working with Facebook to develop student data badges. We have written about student micro-credentials (also called data badges) here and here, and NEPC wrote about them here. As for Digital Promise, we wrote about how Digital Promise is a nonprofit created by the US Department of Ed; it has a global arm and promotes Schools of Innovation, competency based ed, data badges, and Relay Grad School, to name a few. So, this new announcement shouldn’t be a surprise; it will no doubt be a wonderful data collection and marketing tool for Facebook and the US Department of Ed, but it is incredibly alarming for students’ privacy and security.

We have reposted the Getting Smart announcement below.

October 30, 2017, by Getting Smart staff

Digital skills are skyrocketing in demand, and that is a trend that will only continue to increase in impact. More than 8 in 10 middle-skill jobs (82%) require digital skills, and tech companies everywhere often have trouble finding candidates with the right know-how.

One recently announced effort to address this challenge that has us excited is Digital Promise’s partnership with Facebook, in which the two groups have collaborated to create a set of micro-credentials (a form of digital badges) focused on helping adults in the workforce learn these “middle” skills in the area of digital marketing.

We think that this new set of micro-credentials, the pursuit of which will include successive series of in-person workshops organized and implemented by local partners (Digital Promise will train organizations across the state of Michigan to deliver the workshops to their local communities starting in November), is a great way to address the challenge of reaching those who need this type of adult education the most.

Facebook has pledged to train 3,000 Michiganders in digital skills focused on social media over the next two years through these workshops. In the workshop, students will learn some of the basics of social media marketing, and have the opportunity to earn four micro-credentials that demonstrate the skills they have learned:

  • Social Media Marketing Basics
  • Marketing with Facebook Pages
  • Marketing with Facebook Ads
  • Marketing with Instagram

Over four weeks, students will develop a Facebook page and Instagram account for a local community organization or business of their choice; use that page to create awareness, drive traffic, and/or attract customers; and create advertising campaigns in support of that page. We think this approach is exactly the kind of authentic, real-world PBL that will encourage adults to seek these new skills.

In our recent analysis of adult entrepreneurship education (a big upcoming trend), we found that a lack of respected micro-credentials was one of the biggest missing components of entrepreneurship education. The program being developed by Digital Promise and Facebook appears set to provide a model for those looking to address this challenge. Our team is looking forward to hearing more from Digital Promise when we attend EdSurge Fusion later this week.

For more, see:

-Cheri Kiesecker

Robots Replacing Teachers? Laugh at Your Own Risk.

Reposted with permission from Save Maine Schools – Helping You Navigate Next-Gen Ed Reform.

Robots replacing teachers

Read their own documents, and you’ll see that they are planning to turn live, face-to-face teaching into a “premium service.”

A premium service.  

Meaning that they know face-to-face instruction is a better way to learn, and they have no intention of having their own children learn from machines.

*Disclaimer: the mother in this article requested that her identity be kept anonymous for the time being. Additional details are forthcoming.

This fall, parents in a California school district discovered at a sixth grade open house that their child would no longer have a teacher.

Instead, the district had invested in an “exciting new way of learning” – a “personalized learning program” called Summit, designed by Facebook.

After listening to a presentation about the system that parents had received no prior information about (including no information about the program’s data-sharing agreement, which gives Summit full authority to sell student information to third parties), they were ushered into a classroom where they were told to log onto the software program.

When it became clear that no teacher was to be found, one mom went searching for an explanation.

“I went out into the  hallway and found a really young looking woman. She called herself the classroom facilitator, and told us that ‘teacher’ was just an old term.”

The mom’s jaw hit the floor.

Recently, an article has been circulating on the web claiming that “inspirational robots” will begin replacing teachers in the next ten years.

Some have laughed it off, others have called it fear mongering.

One woman went so far as to call it “catastrophizing conspiracy horseshit.”

To these people I say: dismiss this at your own risk.

Those following education policy closely know that the only outrageous part of the headline is the use of the word “inspirational.”

While they may not look like this:

th0KCIT7PO

robots – in the form of data-mining software programs that operate under the Orwellian term “personalized learning” – are already invading our classrooms at lightning speed.

And if you think that what happened in California isn’t about to happen nationwide, check out this document from the high-profile, well-funded Knowledgeworks Foundation, which offers a menu of career opportunities for displaced teachers.

Proponents (who stand to make a boatload off the new system) claim that machine learning is an “inevitable” wave of the future; that it will “free up” teachers to do more “projects” with kids.

But that’s hogwash.

Read their own documents, and you’ll see that they are planning to turn live, face-to-face teaching into a “premium service.”

A premium service.  

Meaning that they know face-to-face instruction is a better way to learn, and they have no intention of having their own children learn from machines.

In that sense, maybe the idea of robots teaching children is “catastrophizing conspiracy horseshit,” if – and only if – you’re among the lucky few.

Save Maine Schools

Did you know Facebook and Summit Charter Schools Have Teamed Up to Deliver Personalized Learning?

Facebook Napalm Girl

It was a lucky shot, some say of Nick Ut’s famous Vietnam War photo The Terror of War, or Napalm Girl, as it is more commonly known. Less lucky, of course, was the little girl in the photo, Kim Phuc. She was running down the street, naked, after a napalm attack on her village. Her skin was melting off in strips. Her home was burning in the background. It was June 8, 1972. Ut was 21 years old. “When I pressed the button, I knew,” Ut says. “This picture will stop the war.” It has been 42 years since then. But that moment still consumes him.

In 1972, four years after the Tet Offensive, the Vietnam War had put President Nixon in a very tough spot during an election year.

For the first half of 1972, President Nixon made public overtures towards a formal peace agreement with North Vietnam.

After winning his re-election bid and watching the peace negotiations unravel, President Nixon decided to change tactics.

During a meeting with Henry Kissinger and presidential military aide General Alexander Haig, the decision was made to bring in B-52 bombers to escalate and intensify the bombing campaign against North Vietnam.

As Alexander Haig put it, the goal of the bombing campaign was to “strike hard…and keep on striking until the enemy’s will was broken.”

Napalm Girl

On June 8, 1972, Associated Press photographer Nick Ut took a picture of a 9-year-old girl running down the road after her village had been bombed with napalm. Her clothes had disintegrated, her skin scorched by the 2,200-degree burn of napalm.

Ut took the little girl to the hospital and demanded she be treated, despite being told by doctors that she had no chance.

Miraculously, Kim Phuc survived.

Many believe Ut’s photograph of Phuc helped end the Vietnam War.

It was a lucky shot, some say of Nick Ut’s famous Vietnam War photo The Terror of War, or Napalm Girl, as it is more commonly known. Less lucky, of course, was the little girl in the photo, Kim Phuc. She was running down the street, naked, after a napalm attack on her village. Her skin was melting off in strips. Her home was burning in the background. It was June 8, 1972. Ut was 21 years old. “When I pressed the button, I knew,” Ut says. “This picture will stop the war.” It has been 42 years since then. But that moment still consumes him.

Nick Ut’s photograph won the Pulitzer Prize. Kim Phuc and Ut forged a friendship that’s lasted for 45 years.

Facebook’s Censorship of Napalm Girl

In 2016, Norwegian author and journalist Tom Egeland posted eight photos on Facebook, one of them Napalm Girl, as examples of how photography can change the world.

Facebook deleted Napalm Girl citing nudity concerns.

The Norwegian newspaper Dagsavisen contacted Kim Phuc for a comment on the censorship of the iconic photo. This is what she had to say:

“Kim is saddened by those who would focus on the nudity in the historic picture rather than the powerful message it conveys,” Anne Bayin, a spokesperson for the Kim Phuc Foundation, told the newspaper in a statement.

“She fully supports the documentary image taken by Nick Ut as a moment of truth that captures the horror of war and its effects on innocent victims,” she added.

When Tom Egeland posted a link to the Dagsavisen article, Facebook deleted it and suspended Egeland for 24 hours.

The controversy quickly spun out of control. How absurd was Facebook’s commitment to censorship and to being the final arbiter of what its users can see?

The Prime Minister of Norway, Erna Solberg, posted Napalm Girl to her account. Facebook deleted it. Solberg promptly encouraged her cabinet members to post the photo on their Facebook feeds. Half of them did.

In the end, Facebook finally backed down – not because they saw the error in their authoritarian censorship.

No way.

Rather, Facebook finally woke up from its my-way-or-the-highway brinkmanship to find itself engulfed in a firestorm of controversy, which had reached such a frenzy that the company faced a mini-insurrection of users and lots of bad press.

By Friday the internet saw a mini-insurrection, with defiant Facebook users sharing the photo in a protest against apparent ham-fisted censorship. Some 180,000 people used Facebook to view the Guardian’s account of the row – illustrated, paradoxically, with the same uncensored photo. Another 4,000 shared it on Facebook.

Facebook and Summit Charter Schools Team Up to Deliver Personalized Learning

Given Facebook’s penchant for censorship, coupled with the company’s ability to control the content users see with proprietary algorithms, I’m shocked any parent would allow or want their kids to be taught online by a black-box digital curriculum developed by Facebook.

But it’s happening, with the help of gushing, non-critical reporting like this piece from the New York Times:

But the Summit-Facebook system, called the “Summit Personalized Learning Platform,” is different.

The software gives students a full view of their academic responsibilities for the year in each class and breaks them down into customizable lesson modules they can tackle at their own pace. A student working on a science assignment, for example, may choose to create a project using video, text or audio files. Students may also work asynchronously, tackling different sections of the year’s work at the same time.

The system inverts the traditional teacher-led classroom hierarchy, requiring schools to provide intensive one-on-one mentoring and coaching to help each student adapt.

And this:

Mark Zuckerberg, Facebook’s chief executive, and his wife, Dr. Priscilla Chan, were the catalysts for the partnership. It is the couple’s most public education effort since 2010 when they provided $100 million to help overhaul public schools in Newark, a top-down effort that ran into a local opposition.

The Facebook-Summit partnership, by contrast, is more of a ground-up effort to create a national demand for student-driven learning in schools. Facebook announced its support for the system last September; the company declined to comment on how much it is spending on it. Early this month, Summit and Facebook opened the platform up to individual teachers who have not participated in Summit’s extensive on-site training program.

Summit is doing its part by offering a teacher residency program which focuses on training a new type of teacher: one who’s content to be the guide on the side while the Basecamp software does most of the actual teaching.

A network of charter schools in Northern California this month will launch the nation’s first teacher residency program focused on personalized learning.

Twenty-four teachers-in-training will be part of Summit Public Schools’ first Summit Learning Residency Program, which will train teachers to lead students in a personalized learning classroom setting, a hallmark of the Summit model.

And to cement their knowledge of the budding concept that tailors education to the individual, the residents themselves will also learn their coursework and receive their teaching credential through personalized learning.

Teachers, if you don’t think the teaching profession is being downsized, this is your wake-up call.

The Inherent Racism of Summit Charter Schools

A few years back, this blog called out Summit’s racist practices. Summit’s recent team-up with Facebook doesn’t help to change our impression.

Censoring Napalm Girl is a deal breaker.

Racism is always part of the mix and an unspoken justification for the United States’ expansion of empire – from Manifest Destiny to Vietnam. Times may change, but this old habit refuses to die.

Napalm Girl is part of our country’s unflattering past, and if it is censored or left unacknowledged, that past will continue to be repeated.

-Carolyn Leith