INSIDE ADIDAS’ ROBOT-POWERED, ON-DEMAND SNEAKER FACTORY


LAST WINTER, THE sportswear giant Adidas opened a pop-up store inside a Berlin shopping mall. The boutique was part of a corporate experiment called Storefactory—a name as flatly self-explanatory as it is consistent with the convention of German compound nouns. It offered a single product: machine-knit merino wool sweaters, made to order on the spot. Customers stepped up for body scans inside the showroom and then worked with an employee to design their own bespoke pullovers. The sweaters, which cost the equivalent of about $250 apiece, then materialized behind a glass wall in a matter of hours.

The miniature factory behind the glass, which consisted mainly of three industrial knitting machines spitting forth sweaters like dot-matrix printouts, could reportedly produce only 10 garments a day. But the point of the experiment wasn’t to rack up sales numbers. It was to gauge customer enthusiasm for a set of concepts that the company has lately become invested in: digital design; localized, automated manufacturing; and personalized products.

Storefactory was just a small test of these ideas; much bigger experiments were already under way. In late 2015, Adidas had opened a brand-new, heavily automated manufacturing facility in Ansbach, Germany, about 35 miles from its corporate headquarters. Called Speedfactory, the facility would pair a small human workforce with technologies including 3-D printing, robotic arms, and computerized knitting to make running shoes—items that are more typically mass-produced by workers in far-off countries like China, Indonesia, and Vietnam. The factory would cater directly to the European market, with digital designs that could be tweaked ad infinitum and robots that could seamlessly transmute them into footwear customized to the shifting preferences of Continental sneakerheads. By placing factories closer to consumers, Adidas could ostensibly leapfrog over shipping delays and expenses. “What we enable is speed,” said Gerd Manz, vice president of Adidas’ innovation group. “We can react to consumer needs within days.”

Speedfactory, Adidas claimed, was “reinventing manufacturing.” Media reports were no less grand. “By bringing production home,” wrote The Economist, “this factory is out to reinvent an industry.”

In September 2016, the first pair of Speedfactory sneakers came off the line: a very-limited-edition running shoe called Futurecraft M.F.G. (Made for Germany). To hype its release, the company put out a 3-minute teaser video highlighting not just the shoe but its manufacturing process. A suspenseful, intense electronic soundtrack set the mood for a series of futuristic close-ups: dusty white residue on a computer keyboard, various digital control panels, an orange robotic arm sliding into action. When Adidas released 500 pairs of the Futurecraft M.F.G. in Berlin, ­people camped out on the street to buy them, and the sneakers sold out almost instantly.


A wall of fabric allows for experimentation at a “MakerLab” inside Adidas HQ. ÉRIVER HIJANO
Alongside its unveiling of the Futurecraft M.F.G., Adidas made another big announcement: It would soon be building a second Speedfactory—in Atlanta. The future of manufacturing was coming to America too.

This October, the company announced a project called AM4—Adidas Made For—a series of sneakers that would be designed with input from various “running influencers,” ostensibly tailored to the needs of specific cities. The shoes are said to be designed around the unique local challenges runners face: in London, apparently, many runners commute on foot; they need sneakers with high visibility for dark nights and rainy days. New York City is constantly under construction and is organized in a grid, so runners need a shoe that can deftly handle multiple 90-degree corners. Los Angeles is hot and by the ocean. In Shanghai, preliminary research suggested that people primarily exercise indoors. All AM4 shoes would be made in the company’s two Speedfactories and released in limited editions.

At some point I became a bit mystified by all of this. It struck me that most decent running shoes on the market could probably handle Manhattan’s grid. And if a selling point of the Speedfactory was expedited time to market, why use it to manufacture shoes that would have to travel from Germany to China? (The ultimate aspiration is to open Speedfactories in many more regions, but not right away.)

The factory feeds into the jittery discourse about automation replacing human workers.

It seemed clear that the Speedfactory concept fit into a larger economic narrative; I just wasn’t sure which one. Adidas was not alone in betting on the importance of customization; practically every major consulting company—McKinsey, Bain & Company, Deloitte—has issued a do-or-die report in recent years about how “mass personalization” is the wave of the future. And in glancing ways, Speedfactory simultaneously delivered on the dream of distributed manufacturing that the era of 3-D printing was supposed to usher in, and on Donald Trump’s seemingly hallucinatory campaign promise that factory jobs would return to America. Stories about the factory’s reliance on robots also fed into the jittery discourse around automation replacing human work.

The cynical side of me wondered if perhaps the Speedfactory was an elaborate, expensive branding exercise. As with so many new ideas in our current age of innovation, I couldn’t determine whether the rhetoric surrounding the Speedfactory was deeply optimistic or deeply cynical. I was especially curious about what it might mean for America. But the Atlanta factory had not yet opened. So I went to visit the ur-Speedfactory in Ansbach—effectively its twin. To learn about the future of manufacturing in the American South, I needed to travel approximately 5,800 miles to a cornfield in the middle of Bavaria.

The first Speedfactory, in Ansbach, Germany. A second is set to open in Atlanta. ÉRIVER HIJANO

ADIDAS’ HEADQUARTERS IS stationed in Herzogenaurach, a town of 22,000 just outside of Nuremberg whose claim to fame is that it is home to both Adidas and Puma. The competing sportswear companies were founded by brothers Adolf (Adi) and Rudolf Dassler, rumored to have had a falling out while taking cover in a bunker during World War II. For a time, their rivalry supposedly divided residents; Herzogenaurach was nicknamed “the town of bent necks,” due to the local habit of entering a conversation by peering at the feet of one’s interlocutor in order to identify their corporate and social affiliations.

This was not a problem on Adidas’ campus, where affiliation was unambiguous: Everyone in sight was wearing sneakers made by their employer. The campus, dubbed the World of Sports, occupies a sprawling 146-acre former Nazi air base that corporate communications understandably prefers to describe as an old US military station. (After being commandeered by the US Army in 1945, the base was returned to the German government in 1992 and was acquired by Adidas five years later.) Some of the original barracks still stand and have been repurposed as office space. They cut an odd silhouette next to a glass-enclosed cafeteria named Stripes and a mirrored, angular office building named Laces that looks like a high-design airport terminal. Inside Laces, glass walkways crisscross elegantly from side to side, as if pulled through the eyes of a shoe.

The campus holds a full-size soccer pitch, a track, a boxing room, and an outdoor climbing wall. There are multiple outdoor courts for beach volleyball, basketball, and tennis, and employees actually use them. When I visited in early July, small packs of well-shod workers trotted diligently across the campus, threading through sidewalks and toward forest trails. Nearly everyone, on and off the courts, was wearing Adidas apparel along with their sneakers. Disc-like robotic lawnmowers rolled through the grass, munching slowly. Though I am predisposed, as an American Jew descended from Holocaust survivors, to be slightly uneasy at a former Luftwaffe base populated by several thousand well-behaved young people with unifying insignias, the campus had an energetic, spirited vibe. The employees, who hail from all over the world, seemed healthy and happy. It all felt a bit like what you’d imagine if The Nutcracker had been set in a Foot Locker.

Adidas’ German headquarters felt a bit like a production of The Nutcracker set inside a Foot Locker.

Compared with the World of Sports, the Speedfactory—an hour-long bus ride from headquarters—is a relatively featureless box. It is housed in a white office building in the middle of the aforementioned cornfield; the exterior is marked with Adidas flags and the logo of Oechsler Motion, a longtime manufacturing partner, which operates the facility. I went there with a small group of other visitors for a tour. In a carpeted foyer, we pulled on heavy rubber toe caps, a protective measure. Liability thus limited, we traveled down the hallway toward the back of the building and shuffled inside.

The factory was white and bright, about the size of a Home Depot, with high ceilings and no windows. There weren’t many people, though there weren’t that many machines either. Along an assembly line made of three segments, an engineered knit fabric was laser-cut (by robots), shaped and sewn (by humans), and fused into soles (a collaborative, multistep, human-and-machine process). At the far end of the room, an orange robotic arm, perched high on a pedestal atop a particle foam machine, moved in a majestic, elegant, preprogrammed sweep.

The raw components of the sneakers being produced inside the Speedfactory were minimal: rolls of engineered knit fabric; finger-wide strips of semi-rigid thermoplastic polyurethane, which fuse to the exterior of a shoe to give it structure; white granules of thermoplastic polyurethane for Adidas’ signature Boost soles; an orange neon liner imported from Italy; and a “floating torsion bar,” purportedly for increased support, that looked like a double-headed intrauterine device.

A worker whistled as he placed oddly shaped, laser-cut flaps of the knit fabric onto a conveyor belt. They looked a bit like Darth Vader’s helmet in silhouette. The conveyor belt glided them through white, cubelike cases with tinted glass, where a machine heat-fused the strips of thermoplastic polyurethane onto the fabric in a precise pattern. A factory worker riding a white forklift rolled slowly past.

Another worker passed the flaps of fabric back to a line of sewing machines operated by humans, who stitched them together to form three-dimensional little booties—the uppers of the sneakers. These were then stretched by an additional factory worker over a contraption that bore two model feet, as if a mannequin had been lying on its back, playing airplane. The feet were then detached—also by a human—and placed into a large, glass-doored machine. In what can only be described as a genuinely dramatic 93 seconds, the door to the machine slid shut, a hot light flared up from behind the bootie-clad feet, and the knit uppers fused to a pair of soles. In traditional shoe factories, this process generally involves a messy and imprecise feat of gluing, performed by the dexterous hands of warm-blooded people. Here, it was done by what looked like a neo-futuristic Easy-Bake Oven. Later, another human would thread the shoelaces.

The whole process was mesmerizing. As I leaned against the window of the bus back to Nuremberg, I realized that I hadn’t thought about the Second World War for at least five hours, a personal best for my time in Germany.


A motion-capture system collects data on an Adidas shoe. ÉRIVER HIJANO
SPEEDFACTORY AND STOREFACTORY are both the brainchildren of the Future team, a division within Adidas focused on new technologies—a kind of Google X for sneakerheads. The division is small—some 120 people on a campus of 5,000—and its definition of the future is modest: just two to seven years out. “We are like a little company within the company,” a tall, gregarious employee named Klaus told me. As he gesticulated toward the glass doors to the Future team’s offices, which are at the back of Laces, his voice had the breathless tenor of a whisper without being quiet; everything he said sounded like it could be followed by a magic trick. “We try to push our company: Come on, get off your lazy ass, go into a new area.”

Take Storefactory, for example: Klaus described how the idea could scale globally. A user (“I hate the word consumer,” he sighed) could take a body scan once, then order custom clothing to be delivered anywhere in the world. “The future will become so much more versatile and free,” he said.

In the center of the Future team office, a sneaker dangled from the grasp of a small industrial robotic arm, called the LBR iiwa, made by the German automation company KUKA. Engineers were experimenting with ways it might be used in a Speedfactory. Designed for lightweight, intricate assembly work, the arm is sensitive and responsive to touch. It is curved and sleek, like something out of a Pixar movie, or a sex toy.

Some Future team engineers offered to let me teach the iiwa a motion by guiding it with my own hands. I cautiously swirled the arm in a figure-eight and waited for the robot to repeat the gesture. But it remained motionless; the sneaker hung limply. One of the engineers furrowed his brow and tapped at the control panel. I asked what role they thought the arm could play in a Speedfactory. Like many questions posed to the Future team, the answer to this was either top secret or as yet undetermined. “You can make a shoe with totally different materials if you have a robot that can wrap wire around it,” said Tim Lucas, a senior director of engineering. Then he stopped himself. “The robot can work in three dimensions. You don’t necessarily have to have a material that’s cut off a sheet. You can create new, very interesting materials.”

Klaus reappeared, holding a half-full glass of a violet beverage he identified as Purple Rain—“a reminiscence to Prince,” he explained—procured from the campus smoothie bar. As he escorted me back through Laces, we passed a loft-like MakerLab, modeled after a hackerspace and stocked with bolts of textiles, bins of materials, and an array of machines for sewing, woodworking, and 3-D printing. In an atrium, employees congregated near full-size, living trees; they tapped at their laptops by an amphitheater, where TED-style talks are held regularly during lunchtime. The whole scene felt like a startup staffed by athletes.

At a time when the world’s most highly valued and influential companies hail from the West Coast, there is a powerful narrative in the business world that all companies should become tech companies or else risk obsolescence. As the adage goes: innovate or die. Members of the Future team spoke frequently and enthusiastically about their “open source approach” to research and development. When, in October, the AM4 series was announced, a video spliced footage of runners with footage from the Speedfactory, with a voice-over that mimicked the sound of an astronaut urgently transmitting over a weak radio link from the moon: “Athlete data-driven design,” the voice said, mysteriously. “Open source cocreation. Man and machine.” It sounded a bit like an algorithmically generated Silicon Valley word cloud. “Production line of innovations,” it continued. “Accelerated crafting from months to hours. Optimized for athletes.”

This isn’t the first time Adidas has emphasized technology in its products and their branding. In 1984 the company put out a shoe called the Micropacer, which held a small computer that calculated distance, pace, and calories. That same year it rolled out the Fire, a sneaker with removable foam inserts of varying densities. In recent years, Adidas has introduced a number of high-tech, exclusive sneakers, including the Futurecraft 4D, which boasts a 3-D-printed sole “crafted with light and oxygen.” Lately, Adidas has also turned to more sustainable materials, releasing a number of products made with “Parley Ocean Plastic”: recycled plastic collected in the Maldives by a nonprofit organization.

A cart full of the company’s proprietary Boost midsoles. ÉRIVER HIJANO
But perhaps more than the tangible qualities of products themselves, Adidas is altering the long-running scripts for the ways consumers build a narrative around fashion. With sneaker manufacturing so tied to sweatshops in Asia, companies like Adidas and Nike have long downplayed the origin stories of their products. But with the push toward sustainability, robotics, and personalized goods, Adidas is encouraging consumers not only to consider where their shoes come from but also to pay a premium for the origin story. Boost midsoles are already being produced in more traditional factories, such as those in China, and at a much higher volume. They don’t need to be made in a Speedfactory. Producing components that are usually made elsewhere in a high-tech manufacturing environment struck me as less of a way to optimize a supply chain than a conceit—a story to be told. Tech, or at least its aesthetic, has a halo effect.

WHEN THE ATLANTA Speedfactory opens at the end of this year, it will bring about 160 new jobs. The party line is that Speedfactory’s robots will not replace humans but instead provide job opportunities for “upskilled” factory workers. Job listings include roles for quality inspectors, tailors, process engineers with robotics experience, and technicians with fluency in machining. The Speedfactories will produce about half a million pairs of shoes—just a sliver of Adidas’ total annual output, which runs close to 300 million. The Speedfactory sneakers, at least in the short term, are likely to be sold to a niche audience that’s willing to pay upward of $260 for a limited-edition pair of shoes.

Some economists are bullish on ideas like Speedfactory and see it as the start of a much larger trend. “We are finally escaping from the manufacturing trap that we’ve been in for the last 20 years,” says Michael Mandel, chief economic strategist at the Progressive Policy Institute in Washington, DC, referring to the mass offshoring of production to Asia. Improvements in automation can now finally substitute for cheap foreign labor, which will naturally push factories closer to where the consumers are. As manufacturing shifts from offshore mass production to customized, local fabrication, new jobs will open up for human workers, some of which have yet to reveal themselves. “We used to have distribution built around manufacturing,” Mandel says, referencing the centrality of offshore factories, “and now I think that manufacturing is going to be built around distribution.”

There’s a powerful narrative in business that all companies should become tech companies.

And yet, for the moment, there isn’t a ton of incentive for Adidas to back out of its global supply chain. The company has done extremely well in recent years: In the second quarter of 2017, sales grew by 21 percent, and all signs pointed to a gain on Nike, its primary competitor. “If you’re Nike and Adidas, you’re making enough money with a large workforce subcontracted through so many factories and so many countries, there’s no desperate urgency to change things around and invest in automation,” says Sarosh Kuruvilla, a professor of industrial relations at Cornell University. “People love to talk about how technology is changing the world, and there’s a lot of buzz around this kind of stuff. One has to look closely at the economics. I think it’s a much slower process.”

Instead, Kuruvilla sees Speedfactory less as a harbinger of large-scale change for all US manufacturing and more as one company’s attempt to keep pace with consumer expectations—expectations that are being set not by historic rivals like Nike but by trends in fast fashion and technology companies like Amazon. If consumers today expect rapid delivery and abundant choice, that’s in part thanks to Amazon Prime, Kuruvilla points out. Speedfactory, in other words, is Adidas’ attempt to develop the capacity to deliver customizable goods quickly. Adidas is already experimenting with embedding chips inside shoes—an approach that could one day collect data on consumer behavior, and in turn inform more customized designs.

This past spring, Amazon—which already has troves of data about buying and spending habits, and a direct line to consumers—received a patent for a manufacturing system that produces “on-demand” apparel. This is exactly the sort of advancement that Adidas’ Future team is bracing for, and, in many regards, hoping to beat.


Adidas uses a ball-kicking robot to test products at its headquarters. ÉRIVER HIJANO

During my visit, Adidas’ chief information officer, Michael Voegele, brought up the Amazon patent and compared the athletic apparel industry to incumbents in the taxi and hotel industries. “We didn’t want to be disrupted by the outside,” he said, explaining one impetus behind the Speedfactory. I was sobered by the prospect of yet another company being laid low by an online superstore that traffics in cloud-computing services and whose algorithms recommend inflatable furniture alongside literature in translation.

The specter of the tech industry looms large, as both an aspiration and a threat. Thinking back on Voegele’s comments later as I trudged through the cobblestone streets of Nuremberg, I felt a wave of sadness and sympathy, two emotions I had never experienced on behalf of a corporation. All this talk of technological advancement and running shoes that can handle 90-degree corners. All this talk of innovation, the ocean plastic, the 3-D-printed midsoles. There was so much uncertainty. I wondered if we weren’t all just doing the same thing: working our hardest to find a foothold in the future, then trying to keep that hold for as long as we can.

Anna Wiener (@annawiener) lives in San Francisco and works in the tech industry.

This article appears in the December issue.


GOOGLE’S AI WIZARD UNVEILS A NEW TWIST ON NEURAL NETWORKS


IF YOU WANT to blame someone for the hoopla around artificial intelligence, 69-year-old Google researcher Geoff Hinton is a good candidate.

The droll University of Toronto professor jolted the field onto a new trajectory in October 2012. With two grad students, Hinton showed that an unfashionable technology he’d championed for decades called artificial neural networks permitted a huge leap in machines’ ability to understand images. Within six months, all three researchers were on Google’s payroll. Today neural networks transcribe our speech, recognize our pets, and fight our trolls.

But Hinton now belittles the technology he helped bring to the world. “I think the way we’re doing computer vision is just wrong,” he says. “It works better than anything else at present but that doesn’t mean it’s right.”

In its place, Hinton has unveiled another “old” idea that might transform how computers see—and reshape AI. That’s important because computer vision is crucial to ideas such as self-driving cars and software that plays doctor.

Late last week, Hinton released two research papers that he says prove out an idea he’s been mulling for almost 40 years. “It’s made a lot of intuitive sense to me for a very long time, it just hasn’t worked well,” Hinton says. “We’ve finally got something that works well.”

Hinton’s new approach, known as capsule networks, is a twist on neural networks intended to make machines better able to understand the world through images or video. In one of the papers posted last week, Hinton’s capsule networks matched the accuracy of the best previous techniques on a standard test of how well software can learn to recognize handwritten digits.

In the second, capsule networks almost halved the best previous error rate on a test that challenges software to recognize toys such as trucks and cars from different angles. Hinton has been working on his new technique with colleagues Sara Sabour and Nicholas Frosst at Google’s Toronto office.

Capsule networks aim to remedy a weakness of today’s machine-learning systems that limits their effectiveness. Image-recognition software in use today by Google and others needs a large number of example photos to learn to reliably recognize objects in all kinds of situations. That’s because the software isn’t very good at generalizing what it learns to new scenarios, for example understanding that an object is the same when seen from a new viewpoint.

To teach a computer to recognize a cat from many angles, for example, could require thousands of photos covering a variety of perspectives. Human children don’t need such explicit and extensive training to learn to recognize a household pet.

Hinton’s idea for narrowing the gulf between the best AI systems and ordinary toddlers is to build a little more knowledge of the world into computer-vision software. Capsules—small groups of crude virtual neurons—are designed to track different parts of an object, such as a cat’s nose and ears, and their relative positions in space. A network of many capsules can use that awareness to understand when a new scene is in fact a different view of something it has seen before.
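The core routine in Hinton’s papers, called “routing by agreement,” is compact enough to sketch. Below is a toy NumPy rendition under simplifying assumptions: the capsule counts, dimensions, and random prediction vectors are illustrative stand-ins, not the paper’s actual configuration, and a real capsule network wraps this step in trained layers.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Squash nonlinearity from the capsules paper: preserves a vector's
    # orientation while compressing its length into [0, 1), so length
    # can act as a probability that the capsule's entity is present.
    norm_sq = np.sum(s ** 2, axis=axis, keepdims=True)
    return (norm_sq / (1.0 + norm_sq)) * s / np.sqrt(norm_sq + eps)

def route(u_hat, n_iters=3):
    # u_hat: prediction vectors, shape (n_lower, n_higher, dim) --
    # each lower capsule's guess at each higher capsule's output.
    b = np.zeros(u_hat.shape[:2])                             # routing logits
    for _ in range(n_iters):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)  # coupling coefficients
        s = (c[..., None] * u_hat).sum(axis=0)                # weighted vote per higher capsule
        v = squash(s)                                         # candidate outputs
        b += np.einsum('lhd,hd->lh', u_hat, v)                # boost votes that agree
    return v

# Example: 6 lower-level capsules voting for 3 higher-level, 4-D capsules.
rng = np.random.default_rng(0)
print(route(rng.normal(size=(6, 3, 4))).shape)                # -> (3, 4)
```

Each pass strengthens the influence of lower capsules whose predictions agree with the emerging consensus—the “agreement” that, in principle, lets the network recognize a familiar object in an unfamiliar pose.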

Hinton formed his intuition that vision systems need such an inbuilt sense of geometry in 1979, when he was trying to figure out how humans use mental imagery. He first laid out a preliminary design for capsule networks in 2011. The fuller picture released last week was long anticipated by researchers in the field. “Everyone has been waiting for it and looking for the next great leap from Geoff,” says Kyunghyun Cho, a professor at NYU who works on image recognition.

It’s too early to say how big a leap Hinton has made—and he knows it. The AI veteran segues from quietly celebrating that his intuition is now supported by evidence, to explaining that capsule networks still need to be proven on large image collections, and that the current implementation is slow compared to existing image-recognition software.

Hinton is optimistic he can address those shortcomings. Others in the field are also hopeful about his long-maturing idea.

Roland Memisevic, cofounder of image-recognition startup Twenty Billion Neurons and a professor at the University of Montreal, says Hinton’s basic design should be capable of extracting more understanding from a given amount of data than existing systems. If proven out at scale, that could be helpful in domains such as healthcare, where image data to train AI systems is much scarcer than the large volume of selfies available around the internet.

In some ways, capsule networks are a departure from a recent trend in AI research. One interpretation of the recent success of neural networks is that humans should encode as little knowledge as possible into AI software, and instead make them figure things out for themselves from scratch. Gary Marcus, a professor of psychology at NYU who sold an AI startup to Uber last year, says Hinton’s latest work represents a welcome breath of fresh air. Marcus argues that AI researchers should be doing more to mimic how the brain has built-in, innate machinery for learning crucial skills like vision and language. “It’s too early to tell how far this particular architecture will go, but it’s great to see Hinton breaking out of the rut that the field has seemed fixated on,” Marcus says.

UPDATED, Nov. 2, 12:55 PM: This article has been updated to include the names of Geoff Hinton’s co-authors.


AN OLD TECHNIQUE COULD PUT ARTIFICIAL INTELLIGENCE IN YOUR HEARING AID


DAG SPICER IS expecting a special package soon, but it’s not a Black Friday impulse buy. The fist-sized motor, greened by corrosion, is from a historic room-sized computer intended to ape the human brain. It may also point toward artificial intelligence’s future.

Spicer is senior curator at the Computer History Museum in Mountain View, California. The motor in the mail is from the Mark 1 Perceptron, built by Cornell researcher Frank Rosenblatt in 1958. Rosenblatt’s machine learned to distinguish shapes such as triangles and squares seen through its camera. When shown examples of different shapes, it built “knowledge” using its 512 motors to turn knobs and tune its connections. “It was a major milestone,” says Spicer.

Computers today don’t log their experiences—or ours—using analog parts like the Perceptron’s self-turning knobs. They store and crunch data digitally, using the 1s and 0s of binary numbers. But 11 miles away from the Computer History Museum, a Redwood City, California, startup called Mythic is trying to revive analog computing for artificial intelligence. CEO and cofounder Mike Henry says it’s necessary if we’re to get the full benefits of artificial intelligence in compact devices like phones, cameras, and hearing aids.

Mythic’s analog chips are designed to run artificial neural networks in small devices. MYTHIC
Mythic uses analog chips to run artificial neural networks, or deep-learning software, which drive the recent excitement about AI. The technique requires large volumes of mathematical and memory operations that are taxing for computers—and particularly challenging for small devices with limited chips and battery power. It’s why the most powerful AI systems reside on beefy cloud servers. That’s limiting, because some places AI could be useful have privacy, time, or energy constraints that mean handing off data to a distant computer is impractical.

You might say Mythic’s project is an exercise in time travel. “By the time I went to college analog computers were gone,” says Eli Yablonovitch, a professor at the University of California, Berkeley, who got his first degree in 1967. “This brings back something that had been soundly rejected.” Analog circuits have long been relegated to certain niches, such as radio signal processing.

Henry says internal tests indicate Mythic chips make it possible to run more powerful neural networks in a compact device than a conventional smartphone chip. “This can help deploy deep learning to billions of devices like robots, cars, drones, and phones,” he says.

Henry likes to show the difference his chips could make with a demo in which simulations of his chip and a smartphone chip marketed as tuned for AI run software that spots pedestrians in video from a camera mounted on a car. The chips Mythic has made so far are too small to run a full video processing system. In the demo, Mythic’s chip can spot people from a greater distance, because it doesn’t have to scale down the video to process it. The suggestion is clear: you’ll be more comfortable sharing streets with autonomous vehicles that boast analog inside.

Digital computers work by crunching binary numbers through clockwork-like sequences of arithmetic. Analog computers operate more like a plumbing system, with electrical current in place of water. Electrons flow through a maze of components like amplifiers and resistors that do the work of mathematical operations by changing the current or combining it with others. Measuring the current that emerges from the pipeline reveals the answer.

That approach burns less energy than an equivalent digital device on some tasks because it requires fewer circuits. A Mythic chip can also do all the work of running a neural network without having to tap a device’s memory, which can interfere with other functions. The analog approach isn’t great for everything, not least because it’s more difficult to control noise, which can affect the precision of numbers. But that’s not a problem for running neural networks, which are prized for their ability to make sense of noisy data like images or sound. “Analog math is great for neural networks, but I wouldn’t balance my checkbook with it,” Henry says.
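The plumbing metaphor translates directly into arithmetic. Here is a minimal, hypothetical simulation of one analog multiply-accumulate—not Mythic’s design, just the physics it leans on: weights stored as conductances, inputs applied as voltages, Ohm’s law doing the multiplications, and Kirchhoff’s current law doing the sums on a shared wire. The 2 percent noise figure is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

G = rng.uniform(0.0, 1e-6, size=(3, 8))  # weights stored as conductances (siemens)
V = rng.uniform(0.0, 0.5, size=8)        # inputs applied as voltages (volts)

# Ohm's law: each cell passes current I = G * V; Kirchhoff's current law
# sums the cells along a row's shared output wire -- a dot product in hardware.
I_ideal = G @ V

# Analog parts drift and jitter; model that as roughly 2% error per cell.
I_noisy = ((G * V) * (1 + rng.normal(0, 0.02, size=G.shape))).sum(axis=1)

print(np.abs(I_noisy - I_ideal) / np.abs(I_ideal))  # small relative error per output
```

Run it and the relative error per output hovers around a percent—shrug-worthy for a neural network layer, disqualifying for a checkbook.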

If analog comes back, it won’t be the first aspect of the Mark 1 Perceptron to get a second life. The machine was one of the earliest examples of a neural network, but the idea was mostly out of favor until the current AI boom started in 2012.


Objects identified in video by a simulation of a conventional smartphone chip tuned for artificial intelligence. MYTHIC

A simulation of Mythic’s chip can identify more objects from a greater distance because it doesn’t have to scale down the video to process it. MYTHIC
Mythic’s analog plumbing is more compact than the Mark 1 Perceptron’s motorized knobs. The company’s chips are repurposed flash memory chips like those inside a thumb drive—a hack that turns digital storage into an analog computer.

The hack involves writing out the web of a neural network for a task such as processing video onto the memory chip’s transistors. Data is passed through the network by flowing analog signals around the chip. Those signals are converted back into digital to complete the processing and allow the chip to work inside a conventional digital device. Mythic has a partnership with Fujitsu, which makes flash memory, and aims to get final chip designs to customers for testing next year. The company will initially target the camera market, where applications include consumer gadgets, cars, and surveillance systems.
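A rough sketch of that digital-analog-digital round trip, with invented bit widths and value ranges: weights are quantized to the discrete levels a flash cell can reliably hold, the matrix-vector product happens in the analog domain (idealized here as exact), and an analog-to-digital converter quantizes the resulting currents back into codes the rest of the device can use.

```python
import numpy as np

def quantize(x, bits, lo, hi):
    # Round values onto one of 2**bits evenly spaced levels on [lo, hi],
    # standing in for a flash cell's programmable states or an ADC's codes.
    levels = 2 ** bits - 1
    x = np.clip(x, lo, hi)
    return np.round((x - lo) / (hi - lo) * levels) / levels * (hi - lo) + lo

rng = np.random.default_rng(2)
W = rng.normal(0, 0.5, size=(4, 16))       # trained layer weights
x = rng.normal(0, 1.0, size=16)            # layer input

W_cell = quantize(W, bits=8, lo=-2, hi=2)  # "write" the weights into flash cells
y_analog = W_cell @ x                      # matrix-vector product done as currents
y_digital = quantize(y_analog, bits=8, lo=-8, hi=8)  # ADC back to digital codes

print(np.max(np.abs(y_digital - W @ x)))   # total quantization error stays small
```

The underlying bet is that a network trained with this quantization in mind gives up little accuracy while skipping the digital arithmetic entirely.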

Mythic hopes its raise-the-dead strategy will keep it alive in a crowded field of companies working on custom silicon for neural networks. Apple and Google have added custom silicon to their latest smartphones to power neural networks.

Yablonovitch of Berkeley guesses that Mythic won’t be the last company that tries to revive analog. He gave a talk this month highlighting the opportune match between analog computing and some of today’s toughest, and most lucrative, computing problems.

“The full potential is even bigger than deep learning,” Yablonovitch says. He says there is evidence analog computers might also help with the notorious traveling-salesman problem, which limits computers planning delivery routes, and in other areas, including pharmaceuticals and investing.

Something that hasn’t changed over the decades since analog computers went out of style is engineers’ fondness for dreaming big. Rosenblatt told the New York Times in 1958 that “perceptrons might be fired to the planets as mechanical space explorers.” Henry has extra-terrestrial hopes, too, saying his chips could help satellites understand what they see. He may be on track to finally prove Rosenblatt right.


MAPPING THE FUTURE: CARTOGRAPHY STAGES A COMEBACK

TIM ENTHOVEN
CARTOGRAPHY IS THE new code. Increasingly, everything from your takeout delivery to your UberPool route is orchestrated not just by engineers but by cartographers. Between 2007 and 2015, the number of grads earning master’s degrees in cartography increased annually by more than 40 percent on average. And as advanced satellites, digital mapping tools, and open-source geographical software progress, the demand for cartographers is projected to grow nearly 30 percent by 2024.

Modern cartographers are as much data analysts as they are map producers. Flagship GIS systems by software companies like Esri have been democratized by an explosion of open-source alternatives like Carto and MapBox. “We are absolutely inundated with volumes of geospatial data,” says Mike Tischler, director of the US Geological Survey’s National Geospatial Program, “but with no means to effectively use it all.”

Which is why, as tasks from house-hunting to solving public-health crises depend on sophisticated map integration, cartography grads are being snapped up by Silicon Valley. “Ten years ago someone with geospatial expertise may have been siloed from the engineering team,” says Grubhub CTO Maria Belousova. “Today a huge portion of our team works on spatial search and route optimization.” Data-savvy mappers are charting that digital frontier.

This article appears in the November issue.