Dear User/Supplier,

Greetings!

We are pleased to inform you that your contributions to the greatest research effort ever on the confluence of human and machine learning into a single all-powerful source of knowledge, wisdom and capitalist domination are of the utmost importance and always welcome!

Making the world a better place for all is a massive undertaking and we would be nowhere without the generous efforts of you, your family members, colleagues, friends, business partners, classmates, pets, auto mechanics, health care professionals, insurers and personal assistants (Hey, Alexa! Hi, Siri!) to stay connected, to share and spread the wealth of your lives with our systems and in our spaces. Every day you empower us to dig deeper into understanding much more than just purchasing patterns and streams of demand; you help us comprehend the complexity of human desire; the content of long and short range bucket lists. Because of you and your unfettered thirst for more choices, better selections and unlimited access, we are able to tell you more and more about

  • who you are
  • what you need
  • where you can get it
  • why you deserve it
  • and why anyone who would counter that can climb under a rock, no love lost!

As purveyors of this bright new future, we value every click, keystroke and swipe you make. These seemingly simple actions help us to unlock the secrets to true human understanding and the consolidated wealth of a steadily shrinking class of digital overlords and provide us with the fuel we need to build the bridges of enlightenment for all and for much less than we ever thought it would cost! Your participation is vital and essential and we want to make sure you know how grateful we are.

You are the future we’re building for.* We see you.

Ever in your service,

Tech’s Top DOGS (Digital Overlords Governing Surveillance) 2017

 

*Some restrictions may apply. Full benefits in limited supply. First come, first served. May the better man win (literally). It’s a dog eat dog world. There’s no such thing as a free lunch. Survival of the fittest, baby.  Smile pretty, we got you covered.

 

 

10,000 Characters

How many words might that add up to? How many pages? How many minutes of reading would that entail?
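A rough back-of-envelope, assuming an average English word runs about five characters plus a space: 10,000 characters comes out to roughly 1,600 to 1,700 words, which is several printed pages and somewhere between six and eight minutes of reading at a typical pace.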

Like many other Twitter users, I have some feelings about Twitter lifting the 140-character limit and potentially expanding it to as many as 10,000. I had, and have, feelings about the shift from stars indicating a favorite to hearts indicating a “like.”

I read the articles and posts describing Twitter’s downfall, death, corruption and fight for survival because this is the social media space that best meets my needs so far. And every time I feel myself about to say something sentimental about how and why I “care” about Twitter, I slap myself upside the head and remind myself that, like hundreds of other corporations, this one aims to generate shareholder profits via my ongoing display of “care”: filling its platform with thousands of data points per hour.

And so it was with great relief that I read an article that made plain for me exactly what is at stake with Twitter lifting its signature 140-character limit. Will Oremus argues convincingly that it’s not about the length of the tweet:

What’s really changing here, then, is not the length of the tweet. It’s where that link at the bottom takes you when you click on it—or, rather, where it doesn’t take you. Instead of funneling traffic to blogs, news sites, and other sites around the Web, the “read more” button will keep you playing in Twitter’s own garden.

After a while, you may notice that this garden has expanded to take in territory that once lay beyond its walls—and that those walls are a little higher than you remember them being. Stories published on Twitter may not be available elsewhere. At the same time, Twitter might start to exercise some control over which stories available elsewhere will be allowed inside its garden.

The title of his post, “Twitter Isn’t Raising the 140-Character Limit. It’s Becoming a Walled Garden,” says so much. And what it revealed to me was how much corporations are invested in guiding consumers in the “best way” to enjoy a service or product. Brand loyalty is more important than ever. Attracting an audience or following is one thing; keeping that audience tethered to your platform/service/product long enough to receive adequate “experience enhancers” in the form of specifically targeted advertising is another thing entirely.

And seen that way, I can’t believe that I have fallen in so deeply with all of this. How many terms of use have I knowingly accepted without so much as glancing at the details of my unique surrender? How widely and generously have I distributed my cookies among countless third parties?

So if Twitter changes its character limit, I have essentially all the same choices I have every day. To stay or go. To feed the insatiable monster or reduce my offerings. In truth, I’ve already become quite comfortable in my little garden space. Some things have begun to take root and grow, even thrive on some days. I appreciate the many neighborly interactions with other gardeners. And the wealth of our conversations is generated by the fact that none of us live in our Twitter gardens. We all come and go, check in and check back out. We bring our experiences from elsewhere and re-examine them back in the garden.

 

And yet this garden, with walls or without, is hardly built for permanence, although we like to behave as if it were. Twitter CEO Jack Dorsey was quoted in a USA Today article describing the company’s logic in contemplating the change:

Dorsey has pledged to challenge long-held beliefs and conventions at Twitter in an attempt to reignite user growth.

“I’ve challenged our teams to look beyond assumptions about what makes Twitter the best play to share what’s happening. I’m confident our ideas will result in the service that’s far easier to understand and much more powerful,” Dorsey said during the company’s third-quarter conference call.

What struck me here was the idea of “long-held beliefs and conventions” in a company that is just over 10 years old. “Long-held beliefs” move fast in Silicon Valley, and therefore, apparently, for the rest of us, too. And “reignit[ing] user growth” is every company’s headache. That upward growth curve simply can’t go on forever the way it started. But that seems very hard to accept if you made so much money (or amassed so much attention) for a while there. If Silicon Valley insists that 5-8 years is time enough to have established “long-held beliefs,” then none of us should be surprised when these same corporations begin to speak of “glory days” after 15 years in the market.

Understanding why companies do the things they do with us and supposedly for us has to become an additional priority in our digital day-to-day.  This pains me. I would really rather not bother. But there is too much at stake. How much have I already shared and surrendered? What happens if rather than introducing higher walls, bulldozers arrive and the Twitter garden is made over into a giant strip mall?

This is why we need to keep our eyes open. If you catch me saying that I “care” about Twitter, remind me that Twitter does not care much about me. Twitter cares about Twitter’s survival, which is now measured only in economic terms suitable for Wall Street exchanges. Neither 140 nor 10,000 characters of expression will provide the cure or seal the demise.

Someday we’ll look back and laugh.

More Thoughts on Pasquale’s “Black Box Society”

When I finish reading a good book, my sense of satisfaction and fulfillment tends to be a rather private happening. I finish the book and even as I move on to the next (and there is always a next one), I still spend a fair amount of time processing the last. Since I’ve been blogging, I have used this space to share more thoughts about recent readings and that has felt somewhat liberating.

This summer I even went so far as to tweet out a picture of my proposed reading stack of 4 books:

https://twitter.com/edifiedlistener/status/619863567903256576

Americanah by Chimamanda Ngozi Adichie, The New Jim Crow by Michelle Alexander, The Black Box Society by Frank Pasquale and Data and Goliath by Bruce Schneier. Three quarters of the way done, I find my mind twisting and turning to accommodate so much new and rich input. Only Data and Goliath remains, and as a back-to-back read with The Black Box Society, I feel adequately steeled for whatever fresh insights on data vulnerability it may bring.

Here I want to focus, however, on Black Box because I feel like I will find no peace until I have shared as much as possible while the ideas are still so active in my mind. In an earlier blog post I noted parallels between Pasquale’s illustrations and the Harry Potter series. (Seriously.) In a nutshell, Black Box Society examines the role of algorithmic decision-making in the areas of reputation (how we appear to external parties), search (what we look for online and how the selection and ranking of responses takes place and may impact us), and finance (the business of making (much) more money out of some money). I read it because someone I deeply respect recommended it. Before I started I was already a little apprehensive.

While reading The New Jim Crow presented challenges in the form of emotional labor, I was concerned that Black Box Society might be a bit beyond me. I had reservations about my capacity to grasp all the topics author Frank Pasquale was planning to cover: the intricacies of the tech industry and of finance. I even wrote a sticky note to myself for a potential blog post, “How to read a difficult text”:
*go slowly
*talk back to your negative self-talk (that keeps saying you won’t get it)
*be patient
*allow not knowing
*come back to it again & again – build stamina over time

The sticky note is stuck just inside the front cover. As it turns out, however, I didn’t need it, per se. I made it through the text and felt well guided throughout. This was the first text in a long while that I read with pencil in hand. I underlined a lot and put notes in the margins. I got involved with the text and found unanticipated connections (i.e., to Harry Potter). And, I dare say, I had fun, even reading about finance, because it was explained both generously and with significant intentionality. Particularly when the discussion turned to CDSes (credit default swaps), CDOs (collateralized debt obligations) and MBSes (mortgage-backed securities), which stood at the center of the financial meltdown of 2008, Pasquale provided the necessary scaffolding for me to make sense both of the crisis itself and of the underlying assumptions that made it possible.

As I read I kept coming back to thoughts about privilege, wealth and status. Whether describing the titans of Wall Street or Silicon Valley, Pasquale captures a very wealthy, white male demographic who wield an immense degree of power and influence in both the private sector and government. And their ability to carry out so many of their transactions behind various cloaks of secrecy and complexity, or “black boxes,” reinforces and expands the wealth and privileges this group continues to amass. As an African-American woman and an educator, I found myself reading and thinking that there are few who expect me to read and be up on this stuff. I find myself in this narrative as the clueless user/consumer who stands largely at the mercy of these gigantic corporate structures whose services I engage to write this post, to make it findable on the web, to purchase more books, to tweet more links, to tout my professional skills, to connect with hundreds of other educators, and so on.

Frank Pasquale is extremely candid in his assessment of the current state of affairs:

What we do know is that those at the top will succeed further, thanks in large part to the reputation incurred by past success; those at the bottom are likely to endure cascading disadvantages. Despite the promises of freedom and self-determinism held out by the lords of the information age, black box methods are just as likely to entrench a digital aristocracy as to empower experts. (p. 218)

Think about that for a moment. “Those at the top will succeed further…those at the bottom are likely to endure cascading disadvantages.”
This captures our society with alarming accuracy. And we can be certain that black boxes abound, especially in areas where power is increasingly consolidated. I cannot help but think of the aggressive pursuit of corporate interests in K-12 and higher education, where transparency and openness can quickly become casualties in the fight to “reform” public education through various forms of privatization. I must also consider the prison-industrial complex, which generates billions for shareholders while the United States has the highest incarceration rate in the developed world. As Michelle Alexander asserts in The New Jim Crow, the war on drugs has enabled the creation of a new social undercaste whose political, economic and social disenfranchisement underscores the essence of “cascading disadvantages.”

While reading The Black Box Society I was frequently reminded of an essay by sociologist Tressie McMillan Cottom, whom I quoted in a previous blog post:

…give up on computers and get up on politics. Computers can be fine. Computers are politics. Personalized learning may be fine. Personalized learning is politics. Apps are fine. Apps are politics. Tech is politics. Tech is politics. Tech is politics.

There seems to be no escape from the political no matter where I turn. Completing The Black Box Society becomes a political act, as does reading The New Jim Crow and Data and Goliath. This is me “getting up on politics”: getting informed, adding depth to my otherwise fuzzy notions of impending social and economic demise. It is impossible to read Pasquale and not become politicized. He writes:

Internet and finance firms “set the standard” for our information economy. So far they have used their powers to know the world of commerce ever more intimately…Knowing more than a rival, or simply knowing it faster, is the key to vast fortunes.

But what if economic success were based less on information advantage and more on genuine productivity? Distracted from substantive judgments on what the economy should produce, we have been seduced by the mysterious valuations that Wall Street and Silicon Valley place on goods and services. But their algorithmic methods, framed as neutral and objective, are predictably biased toward reinforcing certain hierarchies of wealth and attention. (pp. 187-88)

The choices we have become very narrow very quickly unless we take steps, at the very least, to understand the evils to which we appear to be wedded. My attempts to comprehend the scope of algorithmic dominance in our information economy seem to have been a wise and useful step. Following @FrankPasquale on Twitter has also broadened my perspective on related topics. This is not about learning the ins and outs of a subject area. Rather, this is about opening my eyes to what is unfolding right in front of me and has a daily impact on how we function as a society. We can’t see everything at once. But we can train our eyes on a specific field for a time in order to gain perspective, insight, and cause for further observation.

Frank Pasquale, The Black Box Society: The Secret Algorithms That Control Money and Information. Cambridge, MA: Harvard University Press, 2015.

 

 

In Deep Water With Audrey and Tressie

As an educator there are plenty of reasons to be on Twitter or to engage on other social media platforms. I’m a PE teacher finishing up a year’s hiatus from the classroom and looking forward to getting back into the routine of working with real children.

That said, my intellectual excursions this year have taken me far beyond my classroom and the practice of teaching. Through extensive and very eclectic reading I’ve ventured into territories that may or may not have to do with education directly. What has happened is that my choices have become more political. The opinions I seek, the analyses I read, and the topics addressed reflect a deliberately more politicized interest. So when I do read about K-12 classroom practice or recent trends in ed-tech, for instance, a filter I have added is political perspective: where is the author coming from? What factors may be contributing to this person’s take on the subject? How might this person’s perspective change and influence mine? What I have found is that reading in areas where I feel to some extent “out of my depth” has worked wonders in allowing me to zero in on what my core beliefs and concerns are when it comes to education.

Two authors who regularly challenge me to start treading in the deep end of my beliefs about education are Audrey Watters and Tressie McMillan Cottom. This week they appear to have double-teamed on the intersecting topics of technology, education, markets and privacy.
First, Audrey goes to town with this talk given at a panel at the International Society for Technology in Education (ISTE) conference last week: Is It Time To Give Up On Computers in Schools?
Provocative? Yes, quite, and by design. Her talk was published on hybridpedagogy.com. She says:

Sure, there are subversive features of the computer; but I think the computer’s features also involve neoliberalism, late stage capitalism, imperialism, libertarianism, and environmental destruction. They now involve high stakes investment by the global 1% — it’s going to be a $60 billion market by 2018, we’re told. Computers involve the systematic de-funding and dismantling of a public school system and a devaluation of human labor. They involve the consolidation of corporate and governmental power. They are designed by white men for white men. They involve scientific management. They involve widespread surveillance and, for many students, a more efficient school-to-prison pipeline —

Further, she suggests:

We gaze glassy-eyed at the new features in the latest hardware and software — it’s always about the latest app, and yet we know there’s nothing new there; instead we must stare critically at the belief systems that are embedded in these tools.

It happens often when I read Audrey’s work that I am called to attention in a visceral way. Her tone is not alarmist, yet her message is alarming if you dare to sit with the implications of all that she is saying. She speaks to a much deeper question than “should I use Firefox instead of Chrome?” (which is where many K-12 tech conversations are happening). Rather, she asserts that our homegrown brands of social and economic inequality are not only baked into the tools we use but that those tools likely reinforce and exacerbate them.

If we want schools to be democratizing, then we need to stop and consider how computers are likely to entrench the very opposite. Unless we stop them.

Then I came across Tressie McMillan Cottom’s remarks prepared for a recently held panel discussion, “New Topics in Social Computing: Data and Education.”
Tressie is a sociologist who, in my mind, has moved mountains in the area of public scholarship. Her high-profile Twitter account has helped promote the visibility of accessible scholarly writing happening both within and outside the academy. Delving into the broad area of “Data and Education,” she asks the reader to get clear about what we mean by “privacy” in this context:

What if privacy is euphemism for individualism, the politically correct cousin of rational actor theories that drive markets that is fundamentally at odds with even the idea of school as a public good? If that is possible (and, I of course, think it is not only possible but the case at hand), then how can we talk about students’ privacy while preserving the integrity of data to observe and measure inequality? I suppose that is where I am on current debates about privacy and data in K-12: are we talking about everyone’s privacy or are we talking about new ways to mask injustice? Do you get to a Brown v. Board when schools that are also businesses own school data? I suspect not, because the rules governing data are different in markets than they are in public trusts.

To grasp what we are dealing with means that we will have to unpack our firmly held beliefs about what is at stake:

I question the assumptions about privacy that seem to be the only way we currently have to talk about how deeply enmeshed schools are in markets. Can we talk about privacy in a way that is about justice rather than individualism? If we cannot then privacy may be as big a threat to students as data mining because they are two heads of the same beast.

In agreeing with Audrey’s call to rid our schools of computers, she remarks:

I would add: give up on computers and get up on politics. Computers can be fine. Computers are politics. Personalized learning may be fine. Personalized learning is politics. Apps are fine. Apps are politics. Tech is politics. Tech is politics. Tech is politics. Unless and until that is the conversation, then tech is most likely a politics at odds with my own.

So there’s that political thing: connecting the things I do, use, and promote to their effect on me, on others, and on our collective existence, and making decisions about my actions based on the outcomes I say I want. If I say I want a more just world, what am I doing to support and promote that? How does it show in my voting behavior, in my media consumption, in the way I choose to raise and educate my children, in the friends I keep, in the organizations I endorse and those I decry? Those are political questions, just as they can be deeply existential questions. The choices I make as an individual do not happen in a vacuum. They occur and have implications in and for my surroundings, and they also express views and beliefs that relate to those surroundings. This is why reading Audrey Watters and Tressie McMillan Cottom has become so important for me. Both point to intersection after intersection where individual decisions collide or overlap with societal assumptions and outcomes.

It’s dizzying and disorienting to do this kind of reading on a regular basis. Feeling “out of my depth” comes at a price. I finally understand that smh is shorthand for ‘shaking my head’, but often I am too bewildered to do even that. Being confronted with how much I don’t know is not nearly as trying and uncomfortable as recognizing how little thought I have given to some very central facets of my daily existence. Tressie and Audrey take me there and what I choose to do with these fresh insights is entirely up to me. I feel like I may be getting a little wiser, gaining a bit more nuance in my political views, stretching my critical thinking muscles a little further.

Tressie’s concluding sentences trigger a peculiar response in me: I think about weightlifting:

 I believe education is a human right when education is broadly defined as the right to know and be. Period. I believe schooling can still do education but it cannot do it and be a market. Information symmetry is at odds with most market relationships and schools have to be about information symmetrically produced, accessed and imagined. Schools can be valuable to markets without becoming them. I believe there is such a thing as a social category that subsumes markets to societies. I believe those are political choices and only effected by social action. 

“Schools can be valuable to markets without becoming them.” That feels to me as though a weight has been lifted off my shoulders, somehow. There’s that blessed moment of recognition: “Yeah, that’s what I wanted to say.” So there’s some comfort.

At the same time, “schooling can still do education but it cannot do it and be a market,” which is where so much neoliberal rhetoric and policy is leading us: to education systems as markets. There’s the weight bearing down on me, on us; the likelihood of freeing ourselves shrinking before our eyes. Unless, of course, we wake up and see that we in fact have choices. We can lift the weight. We needn’t simply succumb to it because it’s heavy and makes us incredibly drowsy.

Audrey and Tressie are here to wake us up. And K-12 educators, this is a conversation we need to be in on. Not only listening but dialoguing. This is how we build critical thinking into our curricula and lesson plans: we do it ourselves. Regularly. We wade into the deep waters and have our beliefs challenged. Readings like these provide necessary starting points.

A Programmable Future


I experienced a rare moment this week. I read a post and quite simply it changed me.

The post helped me see what I was not seeing.

To recognize what I have been avoiding.

To be brave when my fear is the only audible voice I can hear.

The post I read was “The Future of Education: Programmed or Programmable” by Audrey Watters. It is in fact the transcript of a talk she recently gave at Pepperdine University. I encourage you to read the full text to appreciate the strength and wisdom of her arguments.

The first point that got under my skin was this:

Whether it’s in a textbook or in a video-taped lecture, it’s long been the content that matters most in school. The content is central. It’s what you go to school to be exposed to. Content. The student must study it, comprehend it, and demonstrate that in turn for the teacher. That is what we expect an education to do, to be: the acquisition of content which becomes transmogrified into knowledge…

…despite all the potential to do things differently with computers and with the Internet and with ubiquitous digital information, school still puts content in the center. Content, once delivered by or mediated through a teacher or a textbook, now is delivered via various computer technologies.

YES! Content is always at the center, of course. And what have I been working so hard to cultivate in the learning episodes that I design for others? Experience. I want my clients, participants, students, athletes to experience something, to feel something, and thereby come to know “the thing” and what it may mean for them. Content has been a vehicle, but my real desire has always been to generate feelings, emotions, connection: the stuff that makes you feel alive. How very counter-cultural, I now understand.

Audrey Watters goes on to talk about shifting away from the content-centered approach of the “programmed web” and towards the more open and co-constructed “programmable web”:

The readable, writable, programmable Web is so significant because, in part, it allows us to break from programmed instruction. That is, we needn’t all simply be on the receiving end of some computer-mediated instruction, some teacher-engineering. We can construct and create and connect for ourselves. And that means that — ideally — we can move beyond the technologies that deliver content more efficiently, more widely. It means too we can rethink “content” and “information” and “knowledge” — what it means to deliver or consume those things, alongside what it means to build and control those things.

This is about where things started to heat up for me. The next sentence laid my purpose out for me like the Tarot card you knew was coming before you even approached the table:

One of the most powerful things that you can do on the Web is to be a node in a network of learners, and to do so most fully and radically, I dare say, you must own your own domain.

WHAT?

As I read on, two things were happening: my emotions had gotten hold of the stage and were running with it. At the same time, my rational mind tore further into the text looking for something to save me fast.

Authority, expertise, participation, voice — these can be so different on the programmable web; not so with programmed instruction.

The Domain of One’s Own initiative at University of Mary Washington purposefully invokes Virginia Woolf’s A Room of One’s Own: “A woman must have money, and a room of her own, if she is to write fiction.” That is, one needs a space — a safe space that one controls — in order to be intellectually productive.

Boom!

We have an amazing opportunity here. We need to recognize and reconcile that, for starters, in the content that programmed instruction — as with all instruction — delivers, there is a hidden curriculum nestled in there as well. Education — formal institutions of schooling — are very much about power, prestige, and control. [emphasis mine]

and then this:

Despite all the talk about “leveling the playing field” and disrupting old, powerful institutions, the Web replicates many pre-existing inequalities; it exacerbates others; it creates new ones. I think we have to work much harder to make the Web live up to the rhetoric of freedom and equality. That’s a political effort, not simply a technological one.

That’s when the tears came rolling in. Between the deep desire to be that “node in a network of learners” and the self-unhelpful stance of “I could never do that” (in this case, to have, run and maintain my own domain), a larger truth was revealed: I am at liberty to make use of my own superpowers. I am a learner of outrageous potential. There is no reason to believe that I cannot do what no one expects. That’s when all the forces, internal and external, technological and philosophical, which have kept the volume of my fears turned all the way up, seemed suddenly muted.

I’ve been sitting with this experience for a few days now. I wrote to Audrey almost immediately to say thank you, at the same time nearly wanting to ask for the antidote. Because it is a fundamentally scary experience to be exposed to your own potential and to grant it some credibility. And when you belong to a marginalized group, that exposure can be all the more astounding and confounding. Empowerment can feel like work because it is not free. Empowerment always challenges us to imagine, to create, to put into practice what once appeared impossible.