“What’s a computer?”

There’s this meme that keeps coming back on Twitter. A young person discovers a floppy disk and calls it the save icon. Apple is using the same idea with this ad. When the mum asks her daughter what she is doing on her computer, she answers “what’s a computer?”

via Apple’s new ad shows how iPads are going to replace laptops | TechCrunch

There’s no doubt that “computing” will continue to evolve from the way we interpret that action today (based on conventions that come primarily from machines of the ’80s, but also from the mainframes and typewriters that preceded them).

I’ve been using a Google Pixelbook for 99% of my “computing” over the last two weeks. I love the integration this device has with the Android app store and being able to install apps like Microsoft Word, Excel, or PowerPoint and use them in full screen as if I were on a Windows laptop. I also love being able to flip this device around into “tablet mode” and play racing games or browse Netflix using what were previously mobile apps. Combined with the Pixelbook Pen, this device has rapidly changed the way I think about my own workflow.

The iPad Pro can do that for many people (especially students but also “adults”) as well.

I’m a big fan of the show Westworld. It has incredible visual effects and a captivating story. But the technology used by characters on the show is what really draws me in (I know, I know). The handheld “computing” devices they use, with foldable screens, touch sensing, AI, and integration of mobile and laptop features, are so attractive to me. I hope Apple / Google / Amazon / Microsoft or whatever company is currently being bootstrapped in a young person’s garage apartment gets us there in the next decade.

We’re almost there with transitional devices like the iPad Pro or the Pixelbook.

 

How should we regulate Facebook and Google’s advertising platforms?

So how does Facebook’s ad system work? Well, just like Google, it’s accessed through a self-service platform that lets you target your audiences using Facebook data. And because Facebook knows an awful lot about its users, you can target those users with astounding precision. You want women, 30–34, with two kids who live in the suburbs? Piece of cake. Men, 18–21 with an interest in acid house music, cosplay, and scientology? Done! And just like Google, Facebook employed legions of algorithms which helped advertisers find their audiences, deliver their messaging, and optimize their results. A massive ecosystem of advertisers flocked to Facebook’s new platform, lured by what appeared to be the Holy Grail of their customer acquisition dreams: People Based Marketing!

via Lost Context: How Did We End Up Here? – NewCo Shift

I’m really torn on this one. John Battelle here (a tech publishing veteran who knows a good deal about online advertising) argues for more regulation and transparency of Facebook and Google’s advertising platforms.

I’ve seen how both Facebook and Google’s advertising platforms can work wonders for good causes like the nonprofits, religious groups, and community organizations that are our clients. It’s wonderful to see the way we can work miracles (hyperbole) to create new reach, fundraising, and awareness campaigns for these groups on a limited budget using Facebook Ads and AdWords. In the past, that would have required them to spend exponentially more on marketing and advertising. But now, we can help these groups grow on a shoestring. That’s a good thing.

However, we are at an inflection point.

I agree with Battelle on a theoretical level, but there’s also the notion of democratic capitalism and the need to allow markets to flourish or wither based on their own actions (does our democracy value ethics and morality the way it once did, and what does that mean for advertising?).

Then there are the advertising platforms, such as Alibaba, Tencent, and Rakuten, that are already major players in Asia and will soon be major players on a global scale. If we hamstring Google and Facebook, do we run the risk of advertisers abandoning those platforms for greener global pastures?

On the other hand, Russia interfered with our Presidential election and it’s no secret that politicians and special interest groups are doing bad things with these platforms.

The Way the Stems Meet the Curves

YouTube 2017 wordmark, before and after

Technically, this is an absolutely fantastic update. They have taken the blunt shapes of the old letters and improved on all of them to create a beautiful wordmark. At small sizes the change is almost imperceptible but at larger sizes the change is a feast. If the way the stems meet the curves on the bottom of each letter doesn’t give you heart palpitations then you might be in the wrong industry. That is really masterful. Dork-swooning aside, every letter is better — better designed and better suited for every size and screen possible. Play a little game of Spot the Difference and you’ll appreciate what I mean. The wider opening of the “Y”, the rounder sides of the “o” and “e”, the contrast in thicks and thins. So good. Also, the kerning couldn’t be better.

via Brand New: New Logo for YouTube done In-house

Details. And kerning.

Don’t be boring or cheap with your logo or wordmark.

Voice Isn’t the Next Big Platform

This piece is originally from Dec 19, 2016, but it’s interesting to revisit as we enter the home stretch of 2017 (and what a year it has been):

In 2017, we will start to see that change. After years of false starts, voice interface will finally creep into the mainstream as more people purchase voice-enabled speakers and other gadgets, and as the tech that powers voice starts to improve. By the following year, Gartner predicts that 30 percent of our interactions with technology will happen through conversations with smart machines.

via Voice Is the Next Big Platform, and Alexa Will Own It | WIRED

I have no doubt that we’ll all be using voice-driven computing on an ever-increasing basis in the coming years. In our home, we have an Amazon Echo, four Echo Dots, and most rooms have Hue smart bulbs in the light fixtures (oh, and we have the Amazon Dash Wand in case we want to carry Alexa around with us…). I haven’t physically turned on a light in any of our rooms in months. That’s weird. It happened with the stealth of a technology that slowly but surely creeps into your life and rewires your brain, the same way the first iPhone changed how I interact with the people I love. We even renamed all of our Alexa devices to “Computer” so that I can finally pretend I’m living on the Starship Enterprise. Once I have a holodeck, I’m never leaving the house.

And perhaps that’s the real trick to seeing this stealth revolution happen in front of our eyes and via our vocal cords… it’s not just voice-driven computing that is going to be the platform of the near future. In other words, voice alone won’t be the next big platform. There will be a combination of voice AND augmented reality AND artificial intelligence that will power how we communicate with ourselves, our homes, our environments, and the people we love (and perhaps don’t love). In twenty years, will my young son be typing on a keyboard the same way I am to compose this post? In ten years, will my 10-year-old daughter be typing on a keyboard to do her job or express herself?

I highly doubt both. Those computing processes will be driven by a relationship to a device representing an intelligence. Given that we, as a species, adapted to relational interaction through physical cues and vocal exchanges over the last 70 million years, I can’t imagine that a few decades of “typing” radically altered the way we prefer to communicate and exchange information. It’s the reason I’m not an advocate of teaching kids how to type (and I’m a ~90 wpm touch typist).

Voice combined with AI and AR (or whatever we end up calling it… “mixed reality” perhaps?) is the next big platform, because these three will fuse into something new the same way the web (as an experience) fused with personal computing to fuel the last big platform revolution.

I’m not sure Amazon will be the ultimate winner in the “next platform” wars it is waging with Google (Google Assistant), Apple (Siri), Facebook (Messenger), and any number of messaging apps and startups we haven’t heard of yet. However, our future platforms of choice will be very “human” in the same way we lovingly interact with the slab of metal and glass we all carry around and do the majority of our computing on these days. It’s hard to imagine a world where computers shrink to the size of fibers in our clothing and become transparent characters we interact with to perform whatever we’ll be performing. But for most people (I hear you, gamers), the future does not involve a keyboard, a mouse, and a screen of light-emitting diodes, and we’ll all see reality in even more divergent ways than is currently possible as augmented reality becomes mainstream in the same way that true mobile devices did after the iPhone.

Maybe I just watched too much Star Trek: The Next Generation.

Why augmented reality’s future is more practical and rational than you realize

Bryan Richardson, Android software engineer at stable|kernel, wants you to consider this: what if firefighters could wear a helmet that could essentially see through the walls, indicating the location of a person in distress? What if that device could detect the temperature of a wall? In the near future, the amount of information that will be available through a virtual scan of our immediate environment and projected through a practical, wearable device could be immense.

Source: The Technology Behind Pokémon Go: Why Augmented Reality is the Future

Call Pokemon Go silly / stupid / trendish / absurd. To a certain point, the game is incredibly inane. However, it does illustrate that memes and mass fads can still take hold at massive scale despite the “fracturing” of broadcast media and the loss of hegemonic culture.

The more immediate question to me, though, is what to do with this newfound cultural zeitgeist around AR. Surely, there will be more copycat games that try to mirror what Pokemon Go, Nintendo, and Niantic have created. Some will be “better” than Pokemon Go. Some will be direct rip-offs.

Tech behemoths such as Facebook, Microsoft, Samsung, HTC, and now Google understand the long-term implications of AR and are each working toward internal and public projects to make use of this old-but-new intense hope and buzz around the idea of using technology to augment our human realities. I say realities because we shouldn’t forget that we experience the world based on photons bouncing off of things and entering our eyeballs through a series of organic lenses that flip them upside down onto the theater screen that is our retina, which pushes them through the optic nerve to our visual cortex, where our electrochemical neurons attempt to derive or make meaning from the data and process that back down our spinal cord to the rest of our bodies… there’s lots of room for variation and subjectivity given that we’re all a little different biologically and chemically.

We’re going to see a fast-moving evolution of tools for professions such as physicians, firefighters, and engineers, as well as applications in the military and in classrooms, that will give some people pause. That always happens, whether the new technology is movable type or writing or books or computers or the web.

Games (and porn, unfortunately) tend to push us ahead when it comes to these sorts of tech revolutions. That will certainly be the case with augmented reality. Yes, Pokemon Go is silly and people playing it “should get a life.” But remember: the interactions they are having now with that game and with each other will improve the systems of the future and save or improve lives. Also… don’t get me started on what it means to “have a life” given the electrochemical clump of neurons we are all operating from, regardless of our views on objectivity, Jesus, or etiquette.

Pokemon Go snatches all of your Google data


By signing up to play Pokemon Go through Google, many iOS users have unknowingly exposed all of their emails, chats, calendars, documents and more to the game’s developer and third-parties.

Source: Pokemon Go catches all your Google data (here’s how to stop it) | Cult of Mac

I’ve been thinking a good deal about this game over the last few days. I should have posted before, but I wanted to wrap my head around the whole thing (as much as I can).

I’ll have a post up tomorrow with my thoughts.

Until then… this report is insanely terrible and horrifying given our current police state / insurance state / corporatist overlords. Our privacy is our power. Don’t give it away so easily, people.

Update

Fixed with a new update on iOS.

All these problems may just be inevitable teething…

“We haven’t had this kind of transformation since television came in the late ’40s and early ’50s,” says Marc Pritchard, the marketing boss at Procter & Gamble, the world’s largest advertiser. Grappling with these challenges, however, may spur a shift in the industry’s structure. There will always be startups, particularly because technology changes so quickly. But on the whole, power is likely to move to fewer, larger companies.

via The Economist: http://www.economist.com/news/business/21695388-worries-about-fraud-and-fragmentation-may-prompt-shake-out-crowded-online-ad?fsrc=scn/tw/te/pe/ed/invisibleadsphantomreaders

Reverse Engineering Humanity


“We believe that a computer that can read and understand stories, can, if given enough example stories from a given culture, ‘reverse engineer’ the values tacitly held by the culture that produced them,” they write. “These values can be complete enough that they can align the values of an intelligent entity with humanity. In short, we hypothesise that an intelligent entity can learn what it means to be human by immersing itself in the stories it produces.”

Source: Robots could learn human values by reading stories, research suggests | Books | The Guardian

Our stories are important. Our ability to have, interpret, and produce intuition is seemingly something very human. However, we’re finding out that’s not necessarily the case.

Blogging

“I keep remembering that, between Google Reader and its limits (items must have titles), and Twitter with its limits (only 140 chars, no titles, one link, no styling), same with Facebook (no links or styling) that my online writing has diminished dramatically, conforming to the contradictory limits of each of these systems.

I keep working on this, still am. Every day.”

Source: Blogging like it’s 1999 | Dave Winer

The Loss of Solitary Exploration

“This experimental feature helps voters make more informed choices, and levels the playing field for candidates to share ideas and positions on issues they may not have had a chance to address during the debate. By publishing long-form text, photos and videos throughout the debate, campaigns can now give extended responses, answer questions they didn’t get a chance to on stage, and rebut their opponents. As soon as the first debate begins at 7 p.m. ET on Thursday, search “Fox News debate” to find campaign responses.”

Source: Official Google Blog: New ways to stay informed about presidential politics

Wait, you mean the debates are scripted?

But seriously, this is interesting… as I’ve been watching the X Files revival this week (also on Fox™), I’ve been thinking more intentionally about how and why we consume media in 2016 compared to, say, twenty years ago in 1996, when I was a nerdy teenager madly in love with the show. The X Files was something that I watched, recorded, and watched again almost every week in order to parse out a new piece of the show’s ongoing mythology. It was a solitary, but incredibly beneficial, experience. I did the same with Beatles lyrics and Herman Hesse novels around the same time.

However, with this new iteration of the X Files, I’ve noticed that I’m watching my iPad as much as I’m watching the show. The #xfiles stream on Twitter has been an integral part of my viewing. I only realized how much last night, when I was watching the stream and realized that I had missed a key but subtle plot point (I probably would’ve missed it even if I had been watching the show intently rather than partitioning my attention, but still…). A tweet clued me in and I immediately “got it.” Would I have had that experience had I not been following the conversation on Twitter? Maybe. Hopefully in a second or third viewing I would. But I find myself not watching or reading things a second or third time these days because OMG JESSICA JONES is on Netflix and I have to catch up before diving into Making a Murderer before the next season of House of Cards!

Following the X Files last night was the last Democratic Presidential Debate before the Iowa Caucus next week. Again, I spent as much (if not more) time arguing with my friend Thomas Whitley about the merits of Bernie Sanders on Twitter as I did actually watching the debate. I’ve been watching presidential debates since… well, about 1996, when Clinton was at his high point and masterfully debated a credible threat in Bob Dole. Throughout college and graduate school, I loved watching debates and can remember highlights from ’00 and ’04 as if they were fresh memories. Will I remember the ’16 debates (as remarkable as they are given the current political climate) as fondly or as well? I’m not sure. I certainly don’t remember much about the ’12 debates, when I was also using Twitter as a sideshow to further my “engagement with the conversation,” but there are also the variables of age and my diminished attention span to consider.

Perhaps that’s the fulcrum of whatever point I’m trying to make… as we grow older (I’m 37 now), do we intentionally seek out these side reels in order to persuade our minds that things like the X Files or a sporting event or a presidential debate are *really* important? Or do we seek these out as ways to validate our own confirmation bias about a particular football team or candidate (or mythology)?

I’ve noticed that when I read books on my Kindle, I frequently come across highlights that other Kindle users have made. It’s a neat feature, as you get clued into what other readers have considered important or highlight-worthy in the same book you’re reading. It can be turned off, but I haven’t done that yet. I wonder what 17-year-old Sam in 1996 would have said or thought of that feature while poring over Siddhartha for the third time? Would I have even made it through that many readings if I had had the highlights from other readers?

When I was a middle school teacher (I use that past tense loosely, as I’m not sure one can ever divorce oneself from such an absurd calling / profession), I was always an enthusiastic promoter of the “back channel” in the classroom. The back channel, to me, was a space for students to openly raise questions and explore avenues during the course of a class experience. I experimented with various ways to bring about a healthy back channel, but I’m not sure I ever did (I saw good benefits, but there was no way to quantifiably measure them outside of summative assessments, which I also didn’t particularly enjoy). I wonder if I would encourage that back channel presence now, being a little older and with the benefit of hindsight. Did it detract from the class experience in the same way that watching the X Files on both the TV and a second screen detracts from my solitary exploration of thoughts and ideas? Or were there tangible benefits, in the same way that a tweet handed me a plot point last night that I would probably have missed?

I miss the days of having to watch a well-worn VHS recording of a Star Trek TNG episode or The Empire Strikes Back or a Presidential Debate in order to make sure I didn’t miss anything, rather than just googling “last night’s X Files” to find the right subreddit to lose a few hours in. That’s unfair nostalgia (I’m getting old, remember). These tools, these social spaces we’ve created, are doing amazing things for our culture and society. I appreciate how Twitter and Reddit enrich my life.

But sometimes I want to read Siddhartha again, because as a precocious 17-year-old I hated the very idea and existence of CliffsNotes. Now I can’t seem to experience anything without a CliffsNotes version delivered in 140 characters, or a Virgil in the form of a polished Redditor.