

Tesla buys Perbix industrial automation company – Robotics and Automation News (press release)


Tesla has been working with engineering company Perbix for some time, and lately has faced increased pressure in its manufacturing process.

The company says it has faced manufacturing “bottlenecks” as it attempts to meet the high demand for its new Model 3 electric car.

Now, Tesla has apparently decided to invest more money in the manufacturing process.

Tesla last year bought Grohmann Engineering, an engineering company similar to Perbix.

According to Bloomberg, Perbix chairman James Dudley will receive $10.5 million in Tesla stock, although full financial details of the deal are yet to be disclosed.

Bloomberg adds that Tesla is now planning to expand its operations in Minneapolis, where Perbix is based.

Separately, Tesla is reported to be in talks with Chinese authorities about building a factory in Shanghai.

Company invests some cold, hard cash in Rochelle


ROCHELLE – Americold is expanding, adding an $80 million, 140-foot-tall freezer that will bring 70 new jobs to the city.

The company broke ground this week on its new facility, a frozen food warehouse and distribution center with an automated storage and retrieval system, marking its second addition since coming to this Ogle County town in 1995.

It will be attached to the warehouse at 1010 Americold Drive and create an additional 57,600 pallet positions for storage.

“It is a game-changer when a company integrates advanced technology into its operations. Rochelle is proud to be home to another one of Americold’s automatic storage and retrieval system distribution centers,” said Jason Anderson, director of the Greater Rochelle Economic Development Corp.

The project is scheduled to be complete by the end of next year, and the company’s footprint will total more than 33.5 million cubic feet of temperature-controlled warehousing and logistics space.

“Americold’s decision to expand in Rochelle is proof again that we are the Midwest hub for frozen food distribution; we appreciate their continued reinvestment in the city of Rochelle,” Mayor Chet Olson said in a news release.

Increasing the company’s storage capacity will allow it to take on more business in the Chicago area, Americold President and CEO Fred Boehler said.

“Working with some of our key partners, we identified the opportunity to update and expand our campus just an hour east of Chicagoland, and to offer both automated and conventional storage and distribution options,” Boehler said.


Call 815-562-1047 or find the company on Facebook for more information about Americold, 915 S. Caron Road and 1010 Americold Drive, in Rochelle.

The Atlanta, Georgia-based business operates 165 temperature-controlled warehouses across the globe, serves more than 2,500 customers and employs 11,000 associates.

At Toyota, The Automation Is Human-Powered

While the rest of the auto industry increasingly uses robots in manufacturing, Toyota has taken a contrarian stance by accentuating human craftsmanship.

[Photo: courtesy of Toyota]

On the assembly line in Toyota’s low-strung, sprawling Georgetown, Kentucky factory, worker ingenuity pops up in the least expected places. For instance, normally in auto plants installing a gas tank is a tedious, relatively complicated procedure. Because the tank is so heavy, a crane usually positions and holds it against the skeletal frame while employees tighten its straps and bolts from under the chassis, a strained and time-consuming maneuver that requires keeping arms up in the air for long periods of time.

To allay the obvious shortcomings in this process, a group of Toyota workers designed an ingenious device–a multi-armed piece of industrial machinery that in a single action lifts the tank in the air, places it in its crevice and reaches underneath the vehicle’s skeletal body to permanently attach the tank to the chassis. The process is fast, seamless, and ergonomically safe.

Freed from securing the bolts, the workers (the designers of this new device) would seem to be superfluous. However, that’s not the case at all. Indeed, the same number of employees as before are still at that assembly station. But instead of turning bolts in cramped crannies, they are doing the types of less obviously essential human tasks that manufacturers tend to eliminate when automation is introduced; namely, painstaking tactile and visual inspections to check and double-check for flaws on the tank and its connections and for holes or weaknesses in the critical fuel line.

The central role that people play in this corner of the Georgetown plant is repeated in varying degrees throughout the factory and exemplifies the uniqueness of Toyota’s manufacturing philosophy, which while still cutting edge continues to curiously nod to the past. Even as the automaker unveils an updated version of its vaunted production system, called the Toyota New Global Architecture (TNGA), the company has resisted the very modern allure of automation–a particularly contrarian stance to take in the car industry, which is estimated to be responsible for over half of commercial robot purchases in North America.

“Our automation ratio today is no higher than it was 15 years ago,” Wil James, president of Toyota Motor Manufacturing in Kentucky, told me as we sat in his office above the 8.1-million-square-foot (170 football fields) factory. And that ratio was low to begin with: For at least the last 10 years, robots have been responsible for less than 8% of the work on Toyota’s global assembly lines. “Machines are good for repetitive things,” James continued, “but they can’t improve their own efficiency or the quality of their work. Only people can.” He added that Toyota has conducted internal studies comparing the time it took people and machines to assemble a car; over and over, human labor won.

The Robotic Mystique

Such thinking seems unorthodox but it’s not surprising given Toyota’s well-known manufacturing system, which was first popularized in The Machine That Changed the World, an unlikely best-seller in the early 1990s written by three MIT academics. Despite its dry subject, this book had a radical impact inside and outside of the business community–for the first time, unveiling the mysteries of Japanese industrial expertise and popularizing terms like lean production, continuous improvement, andon assembly lines, seven wastes or mudas and product flow. With the publication of The Machine That Changed the World, it became de rigueur for every large and small manufacturer to at least give lip service to emulating Toyota’s production strategy.

But as the decades passed, you’d be forgiven for thinking that many of the balletic assembly line systems depicted in The Machine That Changed the World were anachronistic, especially the ones involving the contribution of human workers. Fundamentally, Toyota’s production principles were keyed to the notion that people are indispensable, the eyes, ears, and hands on the assembly line–identifying problems, recommending creative fixes, and offering new solutions for enhancing the product or process. Today, that idea seems quaint. In the industrial world now, manufacturing prowess is measured more by robotic agility than human ingenuity. As an aspiration, lean is taking a back seat to lights-out–a manufacturing concept Elon Musk is championing for his Model 3 Tesla plant, in which illumination will ultimately not be needed because the factory will be devoid of people. Even before we get there, auto companies like Kia–headquartered in Korea, where the use of robots in manufacturing outpaces all other countries–are already claiming productivity improvements of nearly 200% from automation. Some plants have more than 1,000 robots–and fewer than 1,000 people–on an assembly line.

Indeed, a nearly fetishistic appreciation of automation is ubiquitous these days. Dozens of articles, white papers, and books, written by respected thought leaders, executives, and consultants, depict an industrial future inevitably overrun by robots able to do the most sophisticated tasks at inhuman levels of efficiency. These are siren calls to most manufacturers whose growth plans are conditioned on cutting labor costs, which often make up as much as 25% of the value of their products. Some of the Pollyannaish views about the onslaught of robots foresee a period of unprecedented free time for individuals to cater to the whims of their imagination, turning us all into freelance artisans and entrepreneurs. Other, more sober forecasts worry about what people will do without the satisfaction of a job and the stability of a paycheck. Either way, a revolution awaits us, so goes the conventional wisdom. An oft-quoted Oxford University analysis predicts that over the next two decades, some 47% of American jobs will be lost to automation. In China and India, that figure is even higher: 77% and 69% respectively.

Links To The Past

But Toyota has forged a different path. The automaker, now jockeying with Volkswagen and Renault-Nissan for the top spot in worldwide sales, consistently generates industry-best profit margins, often 8% or more. To maintain this performance, Toyota has eschewed seeking growth primarily through cost-cutting (read: automation), focusing instead on automobile improvements offered at aggressively competitive prices. Codified as the Toyota New Global Architecture, this strategy doesn’t primarily target labor to reduce production expenses but instead is weighted toward smarter use of materials; reengineering automobiles so their component parts are lighter and more compact and their weight distribution is optimized for performance and fuel efficiency; more economical global sharing of engine and vehicle models (trimming back more than 100 different platforms to fewer than ten); and a renewed emphasis on elusive lean concepts, such as processes that allow assembly lines to produce a different car one after another with no downtime. In TNGA’s framework, robots are not the strategic centerpiece, but merely enablers and handmaidens, helping assemblers do their jobs better, stimulating employee innovation and, when possible, facilitating cost gains.

As if to punctuate how old-school this way of thinking is today, Toyota made an unusual executive appointment in 2015. Unexpectedly, the automaker named Mitsuru Kawai, a 52-year veteran of the firm (he was hired at 15), to head up global manufacturing, the highest position ever held by a former blue-collar worker. Kawai is one of the last remaining links at Toyota to Taiichi Ohno, the godfather of lean manufacturing and the Toyota production system. Ohno, who died in 1990, idealized the importance of seasoned and practiced individual workers to the success of the organization. Kawai recalled with some nostalgia how this attitude elevated employee self-regard and in turn the quality of Toyota’s assembly line. When he first started at the company, experienced factory employees were called gods because they were masters that could make anything by hand. Regrettably, he said, more recently Toyota had less appetite for “making use of human skills and wisdom.”

Kawai’s job now is to imbue TNGA with Ohno’s memory by bringing human craftsmanship back to the fore. Soft-spoken and unassuming, Kawai described the manufacturing philosophy he uses to achieve this as uncomplicated: “Humans should produce goods manually and make the process as simple as possible. Then when the process is thoroughly simplified, machines can take over. But rather than gigantic multi-function robots, we should use equipment that is adept at single simple purposes.”

A Series Of Elementary Innovations

Aspects of TNGA are being implemented in most Toyota factories around the world. But Toyota has invested in a $1.3 billion overhaul of Georgetown–its largest plant, where 550,000 Camrys, Avalons, and Lexuses are produced each year–as TNGA’s pilot site before disseminating the new system globally. The imprints of Kawai and Ohno are already evident in large and small ways in the Kentucky facility. For instance, the outsized overhead conveyor belts that used to carry a steady stream of engines to the assembly line have been swapped out for moving pedestals that skate across the factory guided by electronic sensors in the floor. This new engine delivery system (which is, after all, merely a machine replacing a machine) accomplishes a series of manufacturing goals. By eliminating the complex web of conveyor belts, Toyota is able to downsize its plants considerably, essentially to one story from as many as three. That, in turn, results in substantial savings on construction, real estate, cooling, heating, and maintenance, some of the highest costs in managing a factory network.

In addition, the pedestals’ payloads are computer-directed, each engine matched directly with customer purchases. This means that Toyota can theoretically make three SUVs for every sedan in one hour and do the opposite the next, depending on market orders. Such flexible, one-by-one production is the elusive Holy Grail of the auto industry.

And equally significant, freed of the conveyor belts, the assembly line workspaces are relatively uncluttered, cleared of pulleys, tubes, and pipes. As a result, assemblers can spend their limited amount of time with the automobile–usually less than a minute–completing their tasks and checking for defects while not wasting seconds navigating inelegantly around it.

A more rudimentary innovation in Georgetown that dovetails perfectly with TNGA tenets is the floating chair, or raku seat (raku roughly means easy in Japanese). This assembly aid glides along rails in and out of the vehicle and then front to back inside the car, giving installers unimpeded access to difficult-to-reach spots like the dashboard console without having to bend or squeeze into awkward positions. The Toyota employee who designed this device patterned it after the moving swivel chair in his bass fishing rig–and used a seat from his boat to beta test the concept.

Besides its ergonomic benefits, the raku seat prunes seconds off of the production process, which is a persistent goal of Toyota’s manufacturing systems. Repeated across the assembly line, trimming small slices of time adds up to meaningful productivity benefits. “In our world, we see work in 55-second bursts,” said Kentucky Toyota manufacturing president James. “And we challenge our workers to chop a second or more off if they can. If we gain back 55 seconds throughout the factory, we can ultimately eliminate a job and move that worker to another slot where they can begin the innovation process over again. Humans are amazing at finding those stray seconds to remove.”
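The arithmetic behind James’s 55-second rule is easy to sketch. The station counts and per-station savings below are hypothetical (the quote gives only the 55-second work cycle), but they show how one- and two-second gains scattered across stations accumulate into whole work cycles:

```python
# Hypothetical sketch of how small per-station time savings add up
# into full 55-second work cycles, per the rule James describes.
TAKT_SECONDS = 55  # one worker's "burst" of work per vehicle


def cycles_recovered(savings_per_station):
    """Sum the seconds trimmed at each station and express the total
    as (full 55-second cycles recovered, leftover seconds)."""
    total_saved = sum(savings_per_station)
    return total_saved // TAKT_SECONDS, total_saved % TAKT_SECONDS


# Suppose 40 stations each trim 1 second and 20 stations trim 2 seconds:
savings = [1] * 40 + [2] * 20  # 80 seconds saved in total
full_cycles, remainder = cycles_recovered(savings)
print(full_cycles, remainder)  # prints: 1 25
```

In this toy run, 80 saved seconds recover one full 55-second cycle, with 25 seconds banked toward the next, which is the sense in which a factory-wide gain of 55 seconds frees a worker for another slot.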

Among the more compelling experiments underway in Georgetown is a training exercise meant to infuse the TNGA idea that automation should solely grow organically out of human innovation. To this end, assemblers were given a karakuri assignment–a lean management drill that requires workers to build a Rube Goldberg-inspired contraption that operates under its own force to improve a workspace activity. One team is reengineering the flow rack, the ubiquitous stand next to each assembly station that holds the parts needed for the local task. Currently, as shelves are emptied, workers have to manually set them aside and then replace them with a full bin of parts. The “modernized” version will instead rely on a combination of springs, ropes, and weights to navigate this task after a button is pressed. When this decidedly low-tech device is perfected, Toyota plans to use the prototype as the blueprint for a robot to emulate the process.

Toyota Emerges From A Debacle

TNGA, which Toyota expects will reduce manufacturing expenditures by as much as 40%, emerged from a dark period in the early 2000s when the automaker overeagerly attempted to outpace General Motors and Volkswagen as the world’s No. 1 vehicle manufacturer. Toyota has admitted that by juicing production growth too rapidly at that time, quality, manufacturing controls and factory productivity were allowed to lag. So much so that in 2009 Toyota had its first loss in 59 years (in part due to the global financial crisis) and during the next two years recalled more than 10 million vehicles after a spate of sudden acceleration accidents. In 2014, Toyota agreed to pay a record $1.2 billion penalty to end a criminal probe by the U.S. Justice Department into its alleged attempts to mislead the public and hide the true facts about the dangerous problems with its vehicles. Toyota’s CEO Akio Toyoda apologized publicly and abjectly for the company’s failures and said the automaker was “grasping for salvation.” An internal soul-searching followed, which in turn led to the new manufacturing system and ultimately to Kawai’s appointment to lead its implementation and a return to craftsmanship.

As part of the TNGA rollout, Kawai has demanded that factories establish manual workspaces for critical plant processes, in some cases eliminating automation where it had already been installed during the period that Toyota overtaxed its production capacity. Kawai’s goal is twofold. First, to ensure that Toyota’s workers have the expertise and skills to manufacture a car by hand even if they wouldn’t be called upon to fully do so again. Kawai believes that without this body of knowledge assemblers become myopic, focused solely on their small part of the operation and blinded to their responsibility to design improvements for the larger team effort that are required to consistently produce high-quality vehicles. Worse yet, in this narrow outlook, workers often mistakenly see robots as replacements for people rather than basic tools that can be used to enhance factory performance.

Kawai’s second aim in replacing automated factory zones with people is to revisit with a clear mind–removed from the anxiety of a surge in production volume–whether robots have actually improved efficiency in individual plant activities. Some of the results of this experiment are unexpected. For instance, in a Japanese Toyota factory where workers have taken over forging crankshafts out of metal from automated equipment, subsequent innovations have reduced material waste by 10% and shortened the crankshaft production line by 96%.

Is The Robot Threat A Fantasy?

Toyota’s aversion toward automation is noteworthy for the obvious reason that the automaker is arguably one of the most creative and successful manufacturing companies in history, and has never followed the herd but rather set the course. However, beyond that, also worth examining more closely is the question raised by Toyota’s choice of direction: Do robots kill jobs or create them? Toyota, of course, would argue that while some manufacturers eagerly embrace automation and more will in the future, on a larger scale (and ironically in the more innovative and pioneering factories) robots are best used to precipitate more human plant activity rather than reduce it.

Recent analyses of employment data support this somewhat contrarian point of view. In one bit of research, James Bessen, an economist at Boston University School of Law, found that although automation has been increasingly prevalent in all types of services and manufacturing industries since 1950, in that time only one of 270 occupations categorized by the Census Bureau was eliminated by technology; namely, the elevator operator. Other jobs were partially automated, and in many cases automation led to more jobs, often higher-skilled positions at companies that used technology to design and develop new products and new ways to reach customers.

For instance, ATMs have radically altered consumer-banking habits, yet the number of branch employees has grown since money machines were first installed during the late 1990s. “Why didn’t employment fall?” writes Bessen. “Because the ATM allowed banks to operate branch offices at lower cost; this prompted them to open many more branches (their demand was elastic), offsetting the erstwhile loss in teller jobs.” And simultaneously, banks morphed into financial services companies, introducing an array of customized products that tellers were deputized to sell, giving behind-the-cage clerks the same opportunities for upward job mobility as deskbound bankers used to have.

In addition, historically there is a direct correlation between productivity growth, which robots should naturally contribute to, and job creation not explained by population gains. In theory, companies able to manufacture products more quickly and efficiently will reinvest the money from higher sales in assets and innovation and, in turn, additional workers. Or they may lower prices, which drives more consumer spending, higher GDP, and an improved employment outlook. A trenchant study on this topic by the Information Technology and Innovation Foundation illustrated the relationship between productivity and employment by examining economic data of the post-World War II era. ITIF found that in the 1960s, when U.S. productivity grew 3.1% per year, unemployment averaged 4.9%. A couple of decades later, annual productivity growth had fallen to just under 2% and unemployment rates averaged 7.3%. And in the 1990s and early 2000s, in the wake of the internet boom, annual productivity growth was nearly 3% again; in turn, the unemployment rate declined. From 2008 to 2015, productivity gains ticked downward once more to only 1.2%; concomitantly, the rate of job creation has been sluggish compared to the pre-recession period.

These productivity statistics lead to a few significant conclusions about automation today. For one thing, robots thus far do not make up a significant portion of manufacturing activities–responsible for only around 10% of the work in factories, according to some estimates–and companies that have embraced automation have yet to see significant gains from it since productivity growth continues to trend downward. Moreover, the effect that automation has had on employment has been muted. Another bit of data is worth mentioning in this regard: Workers are not leaving occupations as frequently as they did in past decades–the rate of occupational change in the 2000s is 45% lower than the 1940–1980 period and 33% lower than the 1990s, according to the Economic Policy Institute.

Robert Atkinson, ITIF president, believes that robots will in fact have a substantial presence in global factories before too long, although he doesn’t view automation as a threat to human jobs. He asserts that the recent productivity slump reflects a slowdown in innovation. Technology waves lasting as long as 50 years have traditionally transformed society and revitalized economies, but IT has stalled out at the bottom of the S curve, Atkinson argues. “In the 1990s we went from dialup to 3 megabit broadband, that was transformative,” Atkinson says. “But going from 10 megabit to 50 megabit is not. Same thing with how much chips progressed in capabilities in the 1990s, but no more.”

As Atkinson sees it, the somewhat labored abilities of artificial intelligence are holding back robotic skills. That position is shared by John Launchbury, director of DARPA’s Information Innovation Office, who says that AI is within reach but still some distance away from the type of contextual adaptation that true factory automation requires; in other words, AI systems still lack the cognitive skills to understand and manipulate underlying explanatory models and to identify and analyze real-world objects. According to Launchbury, today’s second-wave AI systems are capable of statistical learning; based on millions of bits of data, they can separate one voice from another or a cat from a dog, among many other more complex distinctions. A contextual adaptation system, though, “could say if a specific animal it sees with little more than a cursory glance has ears, paws, or fur and how they differ from another animal in the most minute ways,” he says.

When that level of AI is available, Atkinson argues, the next technological wave–the robot era–will have arrived. And annual productivity could increase to as much as 3.5%. “Which will create hundreds of thousands of jobs for people working with and around robots,” he says.

That, anyway, is what Toyota is counting on–or, better yet, cutting its own curve to make sure it happens.

Surveying: ‘The future is here’ – KHL Group



The days when surveying meant a group of people holding up poles and measuring angles and distances, marking out a site with yet more poles, are long gone, and the techniques used today are becoming more and more sophisticated all the time.

BIM (building information modelling) is a term that was coined only a few short years ago, but is now the key to unlocking the data needed on a big project. And the basic information that allows BIM to hold that powerful position can be sourced easily from many different places – even the sky, with drones increasingly playing a part.

However, all these new technologies and the possibilities they offer have to be harnessed.

Elżbieta Bieńkowska, EU Commissioner for Internal Market, Industry, Entrepreneurship and SMEs, wrote in the introduction to the Handbook for the Introduction of Building Information Modelling by the European Public Sector, “Similar to other sectors, construction is now seeing its own digital revolution, having previously benefited from only modest productivity improvements.

“Building Information Modelling is being adopted rapidly by different parts of the value chain as a strategic tool to deliver cost savings, productivity and operations efficiencies, improved infrastructure quality and better environmental performance.”

She said, “The future is here, and the moment has now come to build a common European approach for this sector. Both public procurement – which is accountable for a major share of construction expenditure – and policy makers can play a pivotal role to encourage the wider use of BIM in support of innovation and sustainable growth, while actively including our SMEs – and generating better value for money for the European taxpayer.”

In the handbook’s executive summary, it says, “The prize is large: if the wider adoption of BIM across Europe delivered 10% savings to the construction sector then an additional €130 billion would be generated for the €1.3 trillion market.

“Even this impact could be small when compared with the potential social and environmental benefits that could be delivered to the climate change and resource efficiency agenda.”

Roads and bridges

One of the leading companies in this area is US-based Bentley Systems. Santanu Das, senior vice president, design modelling, said that one of the biggest advances in information modelling was its use not only in buildings, but also in transportation and heavy civil engineering projects like roads and bridges.

He said there was increased use in brownfield projects.

“Brownfield projects require some sort of starting data,” he said. In the past, 2D drawings were the starting point, then a 3D model.

“One advancement that came out years ago was point clouds – LiDAR,” said Das. “There were two big issues with LiDAR – it was quite bulky and expensive, so you could only do it once every four or five years. The data generated would sometimes be in the terabytes, and there was nothing really available to process it properly.”

He said a third problem was classification.

“If you took a point cloud, you had no idea what the hell those points meant. A human can figure out, that’s a wall, that’s a column, but in order to do what we call classification, automatically, was impossible.”
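To make the classification problem concrete: given nothing but raw coordinates, software must decide what each point belongs to. The toy heuristic below is purely illustrative and is not how Bentley’s classifiers work; it labels points by crude geometric rules, assuming a hypothetical floor height and wall position:

```python
# Illustrative sketch of point cloud classification: label raw (x, y, z)
# points by simple geometric rules. Real reality-modelling software uses
# far more sophisticated (often learned) classifiers; the floor height
# and wall position here are hypothetical.
def classify_point(x, y, z, floor_height=0.2, wall_x=5.0, tolerance=0.1):
    """Return a crude label for one point of the cloud."""
    if z <= floor_height:          # low points: treat as floor
        return "floor"
    if abs(x - wall_x) <= tolerance:  # points on a known vertical plane
        return "wall"
    return "unclassified"          # everything else needs a smarter model


cloud = [(1.0, 2.0, 0.05), (5.02, 3.0, 1.8), (2.5, 2.5, 1.0)]
labels = [classify_point(*p) for p in cloud]
print(labels)  # prints: ['floor', 'wall', 'unclassified']
```

The hard part Das points to is precisely that real scenes offer no known `wall_x`: automatic classification has to infer such structure from the points themselves.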

He continued, “So what Bentley’s been working on in the past couple years on its BIM platform is reality modelling, and that’s now all a part of our Connect Edition platform.”


Connect Edition converges Bentley’s platform technology to support a hybrid environment across desktop modelling applications, cloud services, on-premise servers, and mobile apps.

“Every single Connect Edition product we have – from Building Designer to OpenPlant to OpenRoads – uses this fundamental datatype that we create from our ContextCapture piece in there.

“That’s one huge advantage that we have for people who want to start off on brownfield projects.”

Bentley can now process this information in the cloud. Das said that with LiDAR and any type of photogrammetry, the number of pictures captured could be astronomical.

“What we are doing with our new ContextCapture in the cloud is that we have the ability to process hundreds of thousands of these pictures in a very, very quick manner, because we’re using the power of multiple servers.

“Then we stream that information as needed to the BIM platform via our ProjectWise.”
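The pattern Das describes, splitting a huge photo set into independent per-image tasks and fanning them out across many workers, can be sketched in miniature. Everything below is hypothetical: the function names are invented, and the trivial per-image "result" stands in for real photogrammetry work such as feature extraction.

```python
# Illustrative sketch of parallel processing of a large photo set.
# Not Bentley's actual pipeline: extract_features is a cheap stand-in
# for per-image photogrammetry work, and a thread pool stands in for
# a fleet of cloud servers.
from concurrent.futures import ThreadPoolExecutor


def extract_features(image_id):
    """Stand-in for per-image work; returns (image id, fake result)."""
    return image_id, image_id % 7


def process_in_parallel(image_ids, workers=8):
    """Fan the independent per-image tasks out across a worker pool."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(extract_features, image_ids))


results = process_in_parallel(range(1000))
print(len(results))  # prints: 1000
```

The key property making the cloud approach work is that each image can be handled independently, so throughput scales with the number of servers until the merge step (building the combined model) becomes the bottleneck.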

With ProjectWise, what Bentley calls i-models can be combined together with other documents into a single package, so that models and associated content can be accessed on an iPad using Bentley mobile apps.

“Reality modelling classification is huge,” said Das. “The other thing that we are finding in information modelling today and the advancement in BIM is the collaboration aspect.”

While people can work together and share data, Das said there was a problem: the lack of a common terminology for communication.

“So what we have done is to work really hard to come up with a common terminology for all asset class types. So if we’re talking about a beam in a building, or a beam in a plant scenario, it understands what a beam is.”

Some years ago, Bentley introduced i-models, which Das described as “a sort of pdf for the AEC (architecture, engineering and construction) industry”.

He said, “We’ve taken that to the next level. We’re going to be introducing this thing called the i-model hub, which allows for data to flow from discipline to discipline, and different hierarchies.”

He said there were different levels of detail.

“The hub can filter out the information depending on what your role is, and what your discipline is. It also manages change – which is huge – it’s communicating and constantly keeping that model up to date.”

This communication can be with the products of other companies, too.

“We believe in third party interoperability,” said Das.

The daily visualisation of a jobsite can help minimise construction delays, prevent clashes between the work onsite and the design, eliminate the need for rework, facilitate stakeholder communication and align schedules.

The Pix4D Crane Camera is claimed to combine hardware and software to help with this. A camera is mounted on a tower crane jib, from where it captures site images. These are transmitted wirelessly to the Pix4D cloud and processed automatically into 2D maps and 3D models.

The company behind it said it was designed to monitor any type of construction, and had already been endorsed by some large companies worldwide.

Early adopter

A metro line project in western France was among the early adopters. Dodin Campenon Bernard, part of the Vinci group, was awarded a 14km project that included a tunnel and underground stations.

The station to be monitored was in the heart of the city centre, which made drone flights – one possible option – impossible.

At 32m of digging depth and with massive brace frames to support the excavation, the building site was said to be a challenge. However, through the data collected with the crane camera, Dodin Campenon Bernard was able to follow the evolution of the site day by day.

Romain Nicolas, deputy technical director at Dodin Campenon Bernard, said, “Projects are complicated – unforeseen circumstances can happen and delay the project. These kinds of projects take a few years to achieve, and meanwhile can greatly disturb the neighbourhood.”

He said it was crucial to communicate progress on the construction site, and to share visual updates from the site with local residents and all stakeholders.

In Zurich, Switzerland, it was a railway bridge that was the focus for Pix4D. The capacity of the Zurich rail network and surrounding region was felt to have reached its limit, and Porr Suisse, part of one of the largest Austrian construction groups, was given the job of expanding the railway infrastructure. This included the construction of a 200m bridge and a new track.

Swiss company BSF Swissphoto was in charge of surveying the infrastructure. It used the crane camera to document the current situation of the site, capturing data daily.

The weekly work progress reports produced were said to have improved communication and collaboration between the companies and subcontractors involved.


A large new hospital complex being built in Denmark covers more than 150,000m² and has 13 cranes erected.

Pix4D said that with BIM and digital construction technologies, this project was a perfect example of a connected site. The contractors have been continually testing new technologies, and selected the crane camera to be a part of the project.

The results were said to have quickly proved a huge time saver for the project team. Although the team was based on the work site, the camera was situated on the far side of it, meaning a long walk to check the building status, which could take a few hours. With a permanent monitoring solution like the crane camera, data was automatically available when needed, enabling the project team to get information quickly and make faster decisions when it came to confirming or realigning the schedule.

Pix4D said that combining crane camera use with drones could ensure the most complete aerial site overview, from the earliest earthwork stages of a project.


Drones go under several aliases – UAS (unmanned aircraft system) and UAV (unmanned aerial vehicle), for example.

Trimble is collaborating with Propeller Aero to distribute its UAS analytics platform. Propeller, based in Australia and the US, said it was a leader in the advanced collection, visualisation and analysis of data from drones.

Trimble said Propeller’s simple automated ground control targets, cloud-based visualisation and rapid analysis platform would also be integrated with Trimble Connected Site solutions to bring “an end-to-end cloud-based UAS solution to civil engineering and construction contractors”.

It said that pairing Propeller’s web-based interface with Trimble Connected Site solutions would allow users to unlock the full value of UAS information.


Users can get access to simple tools to measure surface geometry, track trends and changes across time, and perform visual inspections. Trimble said that both technical and non-technical professionals were now able to gather insights remotely and collaborate. It added this would drive improvements in safety and efficiency as well as reducing environmental impact across a construction worksite.

Scott Crozier, director of marketing for Trimble Civil Engineering & Construction, said, “Propeller combines ease of use with powerful analysis tools that allow users to view 2D and 3D deliverables and extract valuable information.

“Like Trimble, Propeller understands the value of quality and accurate data for integration with civil engineering and construction workflows.”

Rory San Miguel, CEO of Propeller Aero, said, “We pride ourselves in taking the most trusted, technical data and tools, and wrapping that up in an easy-to-use online platform that is relevant to the entire organisation, not just technical users.

“Integrating our platform into Trimble’s Connected Site solutions will bring a new class of information to construction sites and organisations globally.”

Also working with UAVs, Texo Drone Survey & Inspection (DSI) said that a big part of keeping on top of potential challenges involved talking to clients before they encountered particular issues, and developing bespoke platforms that meet their needs precisely, by engineering solutions from the bottom up.

It said it had been investing in technology that allowed for heavier payloads and enabled its fleet of UAVs to operate in more difficult weather conditions.

The UAVs currently in operation can deal with wind speeds of up to 15m/s, with the flexibility to carry a variety of custom payloads. Texo DSI has permits for operations up to 20kg, which it said was a game changer for the construction sector.

The company said that with a standard LiDAR survey, data accuracy is generally around 40mm. However, it claimed that through investment in and development of its LiDAR UAV fleet and associated survey software, it was achieving accuracy of 1 to 3mm with its survey-grade UAV-integrated LiDAR system. The system is delivered via a custom-built UAV platform that measures over 1 million points per second.

Topcon Positioning Group has added advanced connectivity options to its DS-200i direct aiming imaging station.

The DS-200i, now with wi-fi access, provides real-time, touchscreen video and photo imaging to capture measured positions.

Ray Kerwin, director of global surveying products, said, “The ultra-wide 5 MP on-board camera provides photo documentation in the field and can now transmit live video using either LongLink or high-speed WLAN as an access point, which allows the FC-5000 or Windows 10 tablets easily to connect.

“The addition of Wi-Fi connectivity offers convenience to the powerful video capabilities of the DS-200i. The system allows for non-prism measurements to be aimed and measured to remote objects – saving time without having to return to the tripod.”

He added, “The live video allows a remote user to know exactly what is being measured.”

Additional standard features include Hybrid Positioning functionality, Xpointing technology for quick and reliable prism acquisition, TSshield telematics security and maintenance technology, and a rating of IP65 for water-resistant construction.

GNSS supported

Leica Geosystems has just released Leica Spider v7.0 software suite, which is now said to support all GNSS (global navigation satellite systems) – GPS, GLONASS, BeiDou, Galileo and QZSS, as well as the GPS-L5 signal for improved network RTK (real time kinematic) correction services.

The all-in-one solution is said to offer users working on surveying and mapping, among other tasks, improved positioning accuracy and correction service. Leica said that professionals could now increase productivity while they operated reliably in environments with obstructions, like urban canyons, or at high latitudes, thanks to the higher number of usable satellites from multiple GNSS constellations.

Leica said that for the first time, all important GNSS network information was available in one “convenient and easy-to-access web user interface”. The Leica Spider Business Centre web portal is said to combine all the elements to operate the infrastructure efficiently, including user and access management, and network and rover status monitoring.


Markus Roland, product manager for GNSS Networks and Reference Stations, said, “Our goal with this new version is to incorporate the latest developments into our solution to continue our history of pioneering in GNSS.

“We strive to deliver reliable productivity improvements for our customers. With the new Spider v7.0, customer benefits are tangible and quality is ensured.”

Another new surveying technology which is increasingly apparent on jobsites is augmented and virtual reality (AR and VR).

In the UK, the University of Strathclyde’s Advanced Forming Research Centre (AFRC) in Scotland and the Advanced Manufacturing Research Centre with Boeing (AMRC) in Sheffield, South Yorkshire, have been working with Glasgow-based design visualisation company Soluis Group and modular building designer and manufacturer Carbon Dynamic.

Together, they claim to have successfully built a demonstrator for the use of AR and VR in the construction industry.

The technology was first trialled on a 2.2m plasterboard wall which, when viewed with a Microsoft HoloLens, showed a 3D rendering of the plumbing and wiring behind the façade.

The system can also be used to examine different wall parts to ensure there are no gaps in insulation before being sent to a construction site.

David Grant, partnership development leader at the AFRC, said, “This new technology has a role to play before, during and after construction of both domestic and commercial properties.”


He said that before work starts, those involved in a construction project would be able to accurately visualise and walk through a building before the foundations were even dug. He said this would help identify any potential issues before they occur.

And a Danish BIM-software company is claiming that for the first time, construction workers will be able to see a mix of reality and digital drawings from their smartphone.

Dalux has launched what it says is the world’s first AR technology that works on mobile devices, and shows a mix of construction drawings and reality – based on what is being looked at and the location.

Jakob Andreas Bærentzen, associate professor at DTU Compute, the Technical University of Denmark, said he was impressed that an AR product was mature enough to aid the construction industry already.

He said, “Dalux’s AR-technology already seems to be useful in practice. This is several years earlier than I expected we would see such solutions.

“It makes the accomplishment even more impressive that the software can handle large amounts of data and is mature for practical use on mobile devices, which are not designed for such tasks in the way the HoloLens is.”

Dalux co-founder Bent Dalgaard said, “Now, at most large construction projects, a digital BIM model is often created. We can access these drawings through mobile devices, based on the construction worker’s location, and show it as AR.

“The fact that the technology can be used on mobile devices makes the adoption in the construction industry much faster, since everybody has a smartphone or tablet these days, and HoloLens is much more expensive, meaning that not all workers have access to the AR drawings.”

Real-time collaboration

Another company, HoloBuilder, which provides 360° reality capturing of construction sites, is releasing a product featuring new capabilities for real-time collaboration and offline handover for project close-out.

HoloBuilder offers a scalable SaaS (software as a service – licensed on subscription) solution. It is said to be a collection of all features that HoloBuilder offers as a collaborative enterprise package – 360° reality capturing with the JobWalk mobile app, TimeTravel for progress documentation, the measurement tool to measure within 360° images, and annotations.

The company said that users could now collaborate with the whole team and enjoy enterprise level service and security. HoloBuilder lets entire construction project teams contribute to the documentation process.

During project close-out, the project can be downloaded and saved as a view-only deliverable for the owner to keep throughout the lifetime of the building.

Robots running the meatworks: Southern hemisphere’s largest fully automated cold storage plant opens in Gympie

By Automation, Editor's Choice

It started with a small butcher shop in Gympie’s Mary Street in the 1950s. Now Nolan Meats has just opened the largest fully automated meat chilling and distribution centre in the southern hemisphere.

Pat and Marie Nolan founded Nolan Meats in 1958. Their sons Tony, Michael and Terry operate the family-owned, vertically integrated business, which employs 405 people across four sites, including three feed lots to ensure consistent supply.

With this latest venture, Nolan Meats director Terry Nolan expected it would take several more months to fully commission the high-tech, multi-million dollar cold storage plant in Gympie.

“This has been five years in the planning,” Mr Nolan said.

“We thought we needed to look at our cold chain security. We’d previously leased premises in Brisbane and we wanted to build a very modern cold store distribution facility.”

The technology was selected to ensure full traceability and increase efficiencies in high volume handling of chilled and frozen beef.

“So instead of doing 500 cattle a day we might get to 1,000 cattle per day.

“Meat processing plants are quite complex. Because carcases are of different sizes, shapes and thicknesses it’s very hard to automate much of that.

“But once you have all your cuts off the carcass and you put them into a standard-sized box it becomes very easy to automate it.

“A part of freezing beef is to freeze it as efficiently as you can to keep the product as fresh as you can and in an 18-hour cycle we can have all that meat hard frozen to -25°C.

“It presents better when it gets to international customers.”

Around 70 per cent of Nolan Meats product is sold domestically, with the remainder exported to Japan, Korea, USA, Indonesia, Malaysia and Egypt.

The upgrade is also expected to deliver efficiencies in processing the chilled beef that is popular in Australia.

“We all know that beef as a protein can be costly for producers,” Mr Nolan said.

The distribution centre has a capacity of 86,000 cartons and is around 100 metres long, with shuttles servicing 23 levels.

Once a carton is ordered through the computer it is automatically collected and delivered via conveyor belt to be packed on pallets.

Each carton is identified by barcode, allowing beef to be sorted by cut, MSA eating quality, and market destination.
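The article doesn’t describe the sorting logic itself, but a barcode-keyed system of this kind reduces to ordering cartons by a composite key. A hypothetical sketch, with field names and grades invented for illustration (not Nolan Meats’ actual system):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Carton:
    barcode: str
    cut: str
    msa_grade: int      # hypothetical MSA eating-quality score
    destination: str

inventory = [
    Carton("C003", "striploin", 4, "japan"),
    Carton("C001", "chuck", 3, "domestic"),
    Carton("C002", "striploin", 5, "korea"),
]

# Retrieve cartons grouped by destination, then cut, then best quality
# first, so a mixed order leaves the store already in delivery order.
picked = sorted(inventory, key=lambda c: (c.destination, c.cut, -c.msa_grade))
print([c.barcode for c in picked])  # ['C001', 'C003', 'C002']
```

The automated storage and retrieval system would then fetch cartons in that order, which is what lets a mixed pallet come out “in perfect order” as described below.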

“The beauty of this system is that the salesman can sit in their office, enter an order and in theory nobody touches the carton,” Mr Nolan said.

“Once it leaves the boning room, it’ll go into a cold store through the automated storage and retrieval system.

“A mixed order can go in and come out in perfect order to deliver to about 300 butchers between the Tweed River and Rockhampton.”

“Those butchers are the lifeblood of our business and some butchers might buy one carton, some might buy six cartons so we could have eight to 10 butchers in one pallet and it’s all come in perfect order and no human hand has touched it until it gets to the butcher shop.”

Despite the move towards mechanisation, Terry Nolan said the company hoped to expand its workforce in the future.

“A part of that is that we need to extend our boning room as well.

“That site could have 700–800 people.”

Korean wholesalers and distributors were invited to the opening and taken to the Muster Cup race day to experience Australia.

“Korea is very important to us,” Mr Nolan said.

“A few years ago when Russia was accused of shooting down a Malaysia Airlines flight, trade embargoes were brought against Russia.

“We were sending nightly flights to Moscow with high quality beef. That market was closed on a political stance, so it’s important to us to have every market in the world open because sometimes they’re very favourable and sometimes they’re very difficult.

“At the moment Korea, while it’s an important market to us, it is quite difficult with the influx of US beef and we’re finding that Australian beef is being slightly displaced in the Korean market because of cheap US beef.

“We’re making our focus around the Asia-Pacific, so Japan, Korea, Malaysia, Indonesia, and the USA.”

“Nolan Meats is Gympie’s largest private employer. That the family has made such a significant investment shows confidence in the region and there’s certainly an opportunity for further job creation,” Councillor Curran said.

“Today is only possible because of our enthusiastic and driven team of Nolan people who have been integral to this development and enabled this dream to become a reality,” Michael Nolan said.

Dependent on funding arrangements, Nolan Meats is also interested in becoming involved in research and development of DEXA technology, using CT scanning to help determine the eating quality of live cattle and beef carcases as well as advancing boning automation.

“If we can get an accurate skeletal diagram of the carcass and then adopt that into an automated chain where we can have a robot programmed to cut a particular joint, that’s where the real benefits would come,” Terry Nolan said.

It just goes to show how much has changed since that first butcher shop opened in 1958.

The Grueling Robot Race That Launched the Self-Driving Car

By Automation, Automotive Industry, Editor's Choice, laser distance sensor, Sensors

On March 13, 2004, a gaggle of engineers and a few thousand spectators congregated outside a California dive bar to watch 15 self-driving cars speed across the Mojave Desert in the first-ever Darpa Grand Challenge. (That’s the Defense Advanced Research Projects Agency, the Pentagon’s skunkworks arm.) Before the start of the race, which marked the first big push toward a fully autonomous vehicle, the grounds surrounding the bar teemed with sweaty, stressed, sleep-deprived geeks, desperately tinkering with their motley assortment of driverless Frankencars: SUVs, dune buggies, monster trucks, even a motorcycle. After the race, they left behind a vehicular graveyard littered with smashed fence posts, messes of barbed wire, and at least one empty fire extinguisher.

What happened in between—the rush out of the starter gate, the switchbacks across the rocky terrain, the many, many crashes—didn’t just hint at the possibilities and potential limitations of autonomous vehicles that auto and tech companies are facing and that consumers will experience in the coming years as driverless vehicles swarm the roads. It created the self-driving community as we know it today, the men and women in too-big polo shirts who would go on to dominate an automotive revolution.

I. The Challenge

In 2001, eager to keep soldiers away from harm in combat zones, the US Congress demanded that a third of the military’s ground combat vehicles be uncrewed by 2015. But defense industry stalwarts weren’t innovating quickly enough on the sensor and computing technologies that would enable autonomous driving. So in February 2003, Tony Tether, the director of Darpa, announced a 142-mile race for self-driving cars, open to anyone, with a $1 million prize for whoever finished its course the fastest. Tether held a kickoff event at the Petersen Automotive Museum in Los Angeles for prospective racers.


TONY TETHER (Darpa): I thought, “Hell, maybe we’ll get five or 10 people to show up.”

JOSE NEGRON (Darpa): The defense contractors that Darpa had been working with, they got stuck in a mind meld. They were thinking step by step, evolutionary, and they weren’t progressing. So we needed a revolutionary approach, a leap forward, and I kept telling Tony there would be hundreds of people interested in joining us, people who do not do DOD business. People who work late at night, in their garages and bedrooms, because they love what they do. For them, failure is OK as long as they learn something. But he didn’t believe me until he arrived at the Petersen for the kickoff.

TETHER: We were going to open the doors at 9 in the morning, and by the time I got there at 8:30 there was a line of people four abreast going all the way around the block. These were just ordinary people who dreamed of having a car that was driverless.

NEGRON: I told them this was gonna be a 142-mile challenge, in the desert. You’ll be going through switchbacks, up and down, paths as narrow as 10 feet, and the vehicle’s gonna have to sense its way through the course. They had a year to figure it out. One guy asked, “Why are you making the race so hard?” I said, “That’s what makes it the Grand Challenge.”

TETHER: Before all this, Darpa’s self-driving programs were working to minimize human intervention. That was the wrong criteria. We were trying to get it so a human never has to be involved.

JOSEPH BEBEL (Team Palos Verdes High School): My mom is an engineer and heard about the challenge. I was looking for new and interesting things to work on, so we brought the challenge to my high school as an extracurricular project.

DAVID HALL (Team DAD—Digital Auto Drive): We were appearing on BattleBots and Robot Wars around that time, which was mostly an excuse for advertising our Velodyne loudspeaker. Then, when this Darpa thing came about, we jumped on it as an opportunity to show off some computer skills and do something fun.

MELANIE DUMAS (Team Axion Racing): When we went to the Petersen event, we found an angel investor who gave us a bit of starter cash. We named our vehicle Spirit of Kosrae, for the island in Micronesia from which our investor was importing bottled water to sell.

JIM MCBRIDE (Ford Engineer): At the time I was working on automotive safety at Ford, and when I heard about the race I volunteered to work at it to see how remote sensing had progressed and to see if there was anything that could be carried over into Ford’s safety program. But most people thought that the notion of a car driving itself was for crackpots.

SAL FISH (Course Designer): I get a phone call from a guy saying, “I’m with Darpa, and we’re putting together an event with autonomous vehicles. Would you help us make a race course?” I made the course. It had rocks on it, left turns, right turns, dips, gullies, cactus all around the place. A drop-off on one side and then 5 miles later a drop-off on the other side. Barbed wire fences, animals that could come out of nowhere, train tracks. Anything and everything that a vehicle would encounter going through the desert.

NEGRON: I hired off-road champions to drive the route. To a man they said, “This is tough, even for us.”

TETHER: We got 106 team applications. We made them write technical papers, explaining what they were going to do. To cut it down, we’d go visit them. One guy’s wife came to the door and asked, “Are you the guys who have got my husband trying to mortgage this house to build his car?”

BEBEL: The basic nature of the challenge was, follow the course they designed using GPS and sensors to avoid obstacles. We wanted to build something that, at least in theory, could complete the race. Our car was a pretty standard Acura SUV with a laser range-finding sensor on the front bumper. The laser created a one-dimensional picture of the road in front of you—if there was an obstacle, it could tell you where it was.
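A one-dimensional range picture like the one Bebel describes can be reduced to obstacle positions by flagging contiguous returns that come back closer than the expected clear-road distance. A simplified sketch under that assumption, not Team Palos Verdes’ actual code:

```python
def find_obstacles(scan, clear=20.0):
    """Return (start, end) index runs where the laser return is closer
    than the clear-road distance, i.e. something is in the way."""
    runs, start = [], None
    for i, r in enumerate(list(scan) + [clear]):  # sentinel closes a trailing run
        if r < clear and start is None:
            start = i
        elif r >= clear and start is not None:
            runs.append((start, i - 1))
            start = None
    return runs

# A sweep across the road: mostly clear, one object dead ahead.
scan = [20.0] * 5 + [8.0, 7.5, 8.1] + [20.0] * 4
print(find_obstacles(scan))  # [(5, 7)]
```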

DUMAS: We had a Jeep Grand Cherokee, and we tried to keep it looking as normal as possible so that people wouldn’t think autonomy was so futuristic or out of reach. We also put a couple of surfboards on the roof, just to stand out. We ran parallel sets of algorithms to guide the vehicle. One took GPS data, one took camera input, one took laser range-finding data, and they all fed into a voting system, which decided where the car was and where it needed to go.
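Dumas doesn’t detail the voting rule, but one robust way to combine three independent position estimates is a component-wise median, which lets two agreeing sensors outvote a single faulty one. A hypothetical sketch of that idea:

```python
import statistics

def vote(estimates):
    """Component-wise median of (x, y) position estimates: with three
    voters, one wildly wrong sensor is simply outvoted."""
    xs, ys = zip(*estimates)
    return statistics.median(xs), statistics.median(ys)

gps_fix = (10.2, 5.1)
vision_fix = (10.0, 5.0)
laser_fix = (42.0, -3.0)   # a spurious reading
print(vote([gps_fix, vision_fix, laser_fix]))  # (10.2, 5.0)
```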

ALBERTO BROGGI (Team TerraMax): Our team specialized in machine vision, so we put three cameras on top of our vehicle—a bright-green, 3-axle military truck—along with two lidar sensors to spot obstacles ahead.

CHRIS URMSON (Red Team, Carnegie Mellon): We realized that, at the speed we needed to drive to finish the race in time, our Humvee wouldn’t be able to see that far ahead. So we gathered satellite imagery of the area. Whatever route Darpa gave us, we’d use those maps to figure out the best path, and our Humvee would use laser sensing to see the shape of the road.

TETHER: Anthony Levandowski was there, the fellow who’s alleged to have taken data from Google to Uber. He had a self-balancing motorcycle. It did a great job in the qualifying rounds.

NEGRON: Levandowski says motorcycles always win off-road races because of speed. And this was a speed event. That kid is a great engineer. He’s in a slight bit of trouble now. [Levandowski, the recently fired head of self-driving at Uber, declined to comment for this story.]

II. The Race

As race day approached, Darpa selected the 25 teams that had shown the most impressive progress and invited them to a qualifying round at California Speedway in Fontana, where they underwent safety inspections and technical tests. Fifteen teams performed well enough to advance. On March 13, 2004, the teams assembled in Barstow, two hours northeast of Los Angeles, to begin the 142-mile drive through California’s Mojave Desert, to Primm, just over the Nevada border.

FISH: We started at a saloon called the Slash X. A bloody shit-­kicking, cowboy-type place.

RED WHITTAKER (Red Team, Carnegie Mellon): It was a cold desert morning, and everything from a motorcycle to an Oshkosh giant military truck showed up.

FISH: My God, these vehicles were something out of Mad Max.

HALL: It was like being at Woodstock for nerds.

NEGRON: Tony’s idea was not to tell anybody where we were going until two hours before the event, then give them a CD with all the waypoints.

TETHER: If we’d given out the coordinates weeks before, someone could have gone out and run the course. This way, they had to prepare their vehicle for everything. It added a little mystique, too.

WHITTAKER: We had a trailer, like a command center, and once we got the disc we went to work. Very quickly the vehicle team was running through everything from the fuel to the engine to the electronics, checking that everything was working.

HALL: You drive your car up to the line and you hop out of it and press the button, and it goes off by itself. It’s the damnedest thing you’ve ever seen.

WHITTAKER: Our vehicle, Sandstorm, was first off the line, so all that mattered was staying ahead.

TETHER: I’m sitting with this four-star general in the grandstand, and when Sandstorm goes off he just went, “Jesus.”

NEGRON: Sandstorm was doing 40 miles an hour.

TETHER: It was really tough terrain. The start was pretty flat, then they had to go up a high hill, with a lot of switchbacks.

NEGRON: The first few vehicles went out without a hitch. The back half of the field … had some issues.

DUMAS: Our Jeep Cherokee came out of the chute, and it made the first turn. And then on the second turn, it made a U-turn and headed back to the starting line. I think it saw the gate as too narrow, and the sensors decided to send the car back to where it came from. We made it 20 feet. That was a little devastating.

TETHER: Levandowski’s two-wheeler, he got so excited, he forgot to throw the switch for the stabilization and it fell right over. It was too bad. I thought it had a shot at being faster than the four-wheelers. But it didn’t go a foot. Another little car went up a berm and flipped over. Another had a GPS problem—it tried to go through a barbed wire fence and got tangled up.

BEBEL: We’d been having a problem with the steering on our Acura, which we called Doom Buggy, for weeks, and we couldn’t test our final fix before the race. It failed, and Doom Buggy couldn’t turn. The race officials let it run 50 yards or so until it hit the concrete sidewall.

BROGGI: We put together all the Oshkosh’s software at the last minute, and it couldn’t discriminate between different types of obstacles. Our laser scanners picked out some brush and decided it was an unmovable obstacle.

TETHER: The Oshkosh truck backed up, and it saw another tumbleweed in the back. So it just went back and forth. Fourteen tons, taken down by two tumbleweeds.

BROGGI: We had no way to move around it. Almost a year of working on the project and we were finished in minutes.

SEBASTIAN THRUN (Former head of Google’s self-driving project): These vehicles didn’t fail because they weren’t rugged enough. They failed because they didn’t take in enough environmental information. None of them saw anything.

By the time the last of the 15 vehicles left the starting gate, the land around the Slash X saloon was a robotic graveyard. The race organizers and observers pinned their hopes on the four vehicles that were still driving an hour after departure, especially David Hall’s Velodyne pickup and Carnegie Mellon’s Sandstorm, last seen ripping through the desert flats and headed for the course’s marquee challenge: a winding road up and over a steep hill called Daggett Ridge.

HALL: There was this one hill you had to get over, at mile seven. And if you made it over that hill, it was absolutely straight, flat all the way to the finish line.

FISH: I was going, “My God, they’re gonna do it.”

WHITTAKER: The important thing that we had to avoid on that hill was going off-center. But a week before qualifying, we rolled our Humvee and damaged the sensors on our roof. It had the effect of the sensors looking a little right, and driving the vehicle a little left.

TETHER: Sandstorm got straddled.

WHITTAKER: We went a touch too far over the edge, and that’s all it took. The tire just spun and spun until it burned. Smoke was pouring off of it. The race officials hit the emergency kill, and the race was over for Sandstorm. I thought, “Oh hell, it’s a robot, I’m not going to get emotional about it.” I just made sure the team was OK, and I told them I appreciated the great thing they’d just done.

TETHER: We paused the race. David Hall’s vehicle was about half a mile behind, still going. But when the race was paused, it rolled up against a rock.

HALL: It was there for an hour or two while they were trying to get Sandstorm out of the way. We were back at our tent, waiting for information from them. We had no idea what the hell was happening.

TETHER: We cleaned up the track and started Hall’s car up again.

HALL: When they started it back up again, it wouldn’t jump over this rock underneath the front tire.

TETHER: And that was the end of it. I got in the helicopter and flew to Primm, where all the TV crews were waiting to see the cars cross the finish line. It was 11 o’clock in the morning, and they said, “How’s it going?” I said, “Well, it’s over. One car went 7.4 miles, it started smoking, blah blah blah. It’s over.” A reporter asked, “Well, what are you gonna do?” I said, “We’re gonna do it again, and this time it’s going to be a $2 million prize.” It was so successful and yet so not successful, I had to do it again.

III. The Aftermath

Despite the slapstick failures of the first race, most of the participants—and many onlookers—saw the same possibilities that Tether saw when he announced the next race.

TETHER: I was disappointed, but it was a spectacular thing, and the people involved understood—this had never been done before.

THRUN: None of what is happening in self-driving today would have happened without the original challenge—it created a new community. They were all newcomers, and the innovation doesn’t come from the core of the field itself but from the outside. The experts are usually the lowest performers, because they’re totally bound in their way of thinking. Very few self-driving car people knew anything about machine learning at the time, for example. David Hall’s Velodyne lidar development was triggered by the Grand Challenge.

HALL: At the first challenge the sensors were all unreliable—they would cause the vehicle to stop or veer off course. So a year after the first Grand Challenge, I started working on developing a new lidar sensor, with a bunch of lasers giving me a 360°, 3-D view. Darpa convinced me to build these sensors and sell them to the other teams for the third challenge.

THRUN: Around 2008, Larry Page convinced me to build Google’s self-driving car team. I met the best people in the field from the Darpa challenges. I hired Chris Urmson and Anthony Levandowski.

URMSON: There was an incredible sense of camaraderie, and that community fostered the folks who are leading a lot of the technology today.

TETHER: And that all comes from the fact that in 2004, some crazy-ass people decided to have a challenge.

The Oral Historians, Then and Now

Tony Tether

Then: Director of Darpa (2001–2009)

Now: Private technical consultant

David Hall

Then: CEO of Velodyne, a loudspeaker company

Now: CEO of Velodyne, now making lidar sensors

Sal Fish

Then: CEO of Score International, which organized desert races like the Baja 1000

Now: Retired

Red Whittaker

Then: Robotics researcher at Carnegie Mellon

Now: Same

Jose Negron

Then: Darpa’s day-to-day manager for the Grand Challenge

Now: Cyberwarfare consultant

Melanie Dumas

Then: Voice-recognition engineer

Now: Program manager on Google’s security team

Alberto Broggi

Then: Lead on Team TerraMax

Now: General manager of VisLab, a computer vision developer

Sebastian Thrun

Then: Machine-learning researcher at Stanford

Now: Head of online university Udacity

Joseph Bebel

Then: Student at Palos Verdes High School

Now: PhD student in computer science at USC

Jim McBride

Then: Engineer in Ford’s safety department

Now: Autonomous vehicle specialist at Ford

Chris Urmson

Then: Robotics student at Carnegie Mellon

Now: Head of Aurora Innovation, a self-driving startup

Alex Davies (@adavies47) is a senior associate editor at WIRED.

This article appears in the August issue. Subscribe now.