The story of the world’s smallest skyscraper

Scraping the sky. Almost. Image: Solomon Chaim at Wikimedia Commons.

According to Emporis, a real estate data company, a skyscraper is a “multi-story building whose architectural height is at least 100 metres”. By that measure, the Newby-McMahon building in Wichita Falls, Texas, which is widely known as the “world’s smallest skyscraper”, isn’t actually a skyscraper at all.

In fact, it’s not even close – the building is four storeys and 12 metres tall, which in most people’s minds makes it little more than a house with ideas above its station. When it was built in 1919, skyscrapers weren’t reaching the heights they do today – but even then, the Newby-McMahon wouldn’t have cut an impressive figure next to the 241-metre Woolworth Building in New York, the world’s tallest building at the time.

Newby-McMahon alongside its major worldwide skyscraper competitors.


Unfortunately for its investors, the building’s limited stature came as a shock to pretty much everyone – apart from the man who built it.

J. D. McMahon was the owner of the Wichita Falls oil company, whose offices occupied a one-storey brick building on the corner of Seventh and La Salle. Next door was a vacant lot, and during the local boom sparked by the discovery of oil in 1912, he decided to meet the city’s growing demand for office space by turning it into a new skyscraper. The building would, plans appeared to show, be 480 feet (146 metres) tall – not bad for a small city barely past its 40th birthday.

McMahon drew up blueprints and plans to show investors, who promptly gave him a total of $200,000 (around $2.7m at today’s prices) to get going on construction. Preferring to keep things in-house, he decided to use his own construction company to build the structure. 

This might be why it took the investors a little while to realise they’d been had. Slightly too late, it became apparent that McMahon was not, in fact, building a 480 foot tower: he was building a 480 inch one. The investors tried to bring a lawsuit against him, but the judge found that they didn’t have a case: they’d signed off on the original blueprints. Sure enough, these promised that the building would be 480" tall, and not, as they’d assumed, 480'.

Construction was completed, if you can call it that, in 1919. The building was 12 feet long, 9 feet wide and 40 feet tall. The elevator company had pulled out, so there wasn’t even a way to get from one floor to the next. And McMahon hadn’t even asked for permission to build on the land. None of this bothered him, however – he disappeared from the town, and probably the state, shortly after, presumably with a good chunk of the investors’ $200,000 in his back pocket.

In his absence, the building became the city’s problem. During the oil boom, it had been an embarrassment; during the depression that followed, it was a liability. For a while, the building was occupied by two firms (the extra-narrow stairs that were added later took up around a quarter of the floor space); later it was boarded up.

For the rest of the 20th century the block was occupied by a string of barber shops and cafes, and on multiple occasions it was scheduled for demolition, but it somehow survived to be palmed off onto a local heritage society. However, the building remained controversial. In 1996, Ralph Harvey of the Wichita County Historical Commission told a reporter from Texnews: “I’ve never understood why some people make such a big deal about it. But about half of the people around here want to save it. The other half would prefer it just to be hauled off.”

In the end, the first half won out, and the building was restored to its former, er, glory. Today it’s a local tourist attraction, with an antiques dealership on the ground floor and an artist’s studio upstairs.

The plaque adorning the building today. The date is that of the completion of the one-storey building next door. Image: Solomon Chaim at Wikimedia Commons.

The Newby-McMahon has often been used as a symbol of the gullibility of the boom era: of the eventual realisation that no, the emperor isn’t wearing any clothes, the petroleum boom won’t last, and this building is not, by any definition, a skyscraper. Yet Fodor’s 2008 guide to Texas, which prides itself on highlighting “the best this big and beautiful state has to offer”, names the Newby-McMahon building as a must-see attraction. If those investors had known, maybe they’d have hung on to it.


Smart cities need to be more human, so we’re creating Sims-style virtual worlds

The Sims 2 on show in 2005. Image: Getty.

Huge numbers of networked sensors have appeared in cities across the world in recent years. These include cameras and sensors that count passers-by, devices that sense air quality, traffic flow detectors, and even beehive monitors. There are also large amounts of information about how people use cities on social media services such as Twitter and Foursquare.

Citizens are even making their own sensors – often using smartphones – to monitor their environment and share the information with others; crowd-sourced noise pollution maps, for example, are becoming popular. All this information can be used by city leaders to create policies, with the aim of making cities “smarter” and more sustainable.

But these data only tell half the story. While sensors can provide a rich picture of the physical city, they don’t tell us much about the social city: how people move around and use its spaces, what they think about their cities, why they prefer some areas over others, and so on. For instance, while sensors can collect data from travel cards to measure how many people travel into a city every day, they cannot reveal the purpose of those trips, or how the travellers experience the city.

With a better understanding of both social and physical data, researchers could begin to answer tough questions about why some communities end up segregated, how areas become deprived, and where traffic congestion is likely to occur.

Difficult questions

Determining how and why such patterns emerge is extremely difficult. Traffic congestion happens as a result of personal decisions about how to get from A to B, based on factors such as your stage of life, your distance from the workplace, school or shops, your level of income, your knowledge of the roads, and so on.

Congestion can build locally at pinch points, placing certain sections of the city’s transport networks under severe strain. This can lead to high levels of air pollution, which in turn has a severe impact on the health of the population. For city leaders, the big question is: which actions – imposing congestion charges, pedestrianising areas or improving local infrastructure – would lead to the biggest improvements in both congestion and public health?

We know where – but why? Image: Worldoflard/Flickr/Creative Commons.

The irony is that, although modern technology has the power to collect vast amounts of data, it doesn’t always provide the means to analyse them. This means that scientists don’t have the tools they need to understand how different factors influence the way cities function and grow. Here, the technique of agent-based modelling could come to the rescue.

The simulated city

Agent-based modelling is a type of computer simulation that models the behaviour of individual people as they move around and interact inside a virtual world. An agent-based model of a city could include virtual commuters, pedestrians, taxi drivers, shoppers and so on. Each of these individuals has their own characteristics and “rules”, programmed by researchers, based on theories and data about how people behave.
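
To make that concrete, here is a deliberately minimal sketch in Python of what such a simulation might look like. It is not the authors’ model: the Commuter class, the one-dimensional road and every threshold in it are illustrative assumptions.

```python
import random

ROAD_LENGTH = 20  # a one-dimensional "road" with cells numbered 0..20

class Commuter:
    """A virtual citizen with simple, researcher-defined rules."""

    def __init__(self, home, workplace, knows_back_roads):
        self.home = home
        self.workplace = workplace
        self.knows_back_roads = knows_back_roads  # one illustrative characteristic
        self.position = home

    def step(self, congestion):
        """Move one cell towards work, unless stuck in a jam."""
        if self.position == self.workplace:
            return
        # Agents who don't know the back roads lose this tick at congested cells.
        if congestion[self.position] > 3 and not self.knows_back_roads:
            return
        self.position += 1 if self.workplace > self.position else -1

# A virtual population: homes at one end of town, workplaces at the other.
agents = [
    Commuter(home=random.randint(0, 5),
             workplace=random.randint(15, ROAD_LENGTH),
             knows_back_roads=random.random() < 0.3)
    for _ in range(100)
]

for tick in range(30):
    congestion = [0] * (ROAD_LENGTH + 1)  # head-count per road cell
    for a in agents:
        congestion[a.position] += 1
    for a in agents:
        a.step(congestion)

arrived = sum(a.position == a.workplace for a in agents)
print(f"{arrived}/100 commuters reached work within 30 ticks")
```

Even a toy like this produces emergent behaviour: jams form where the agents start out clustered together, and clear as the better-informed commuters spread out.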

After combining vast urban datasets with an agent-based model of people, scientists will have the capacity to tweak and re-run the model until they reproduce the phenomena they want to study – whether traffic jams or social segregation. When they eventually get the model right, they’ll be able to look back at the characteristics and rules of their virtual citizens, to better understand why some of these problems emerge, and hopefully begin to find ways to resolve them.
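
As a hedged illustration of that tweak-and-re-run loop, the sketch below sweeps one behavioural parameter of the toy model above and keeps the value whose output lies closest to an observed figure; the “observed” number here is invented for the example.

```python
def run_simulation(informed_share):
    """Re-run the toy model above with one behavioural parameter changed."""
    random.seed(42)  # same virtual population each run, for a fair comparison
    agents = [
        Commuter(home=random.randint(0, 5),
                 workplace=random.randint(15, ROAD_LENGTH),
                 knows_back_roads=random.random() < informed_share)
        for _ in range(100)
    ]
    for _ in range(30):
        congestion = [0] * (ROAD_LENGTH + 1)
        for a in agents:
            congestion[a.position] += 1
        for a in agents:
            a.step(congestion)
    return sum(a.position == a.workplace for a in agents)

OBSERVED_ARRIVALS = 74  # hypothetical figure, e.g. from travel-card records

# Keep the parameter value whose simulated output is closest to the data.
error, best_share = min(
    (abs(run_simulation(p) - OBSERVED_ARRIVALS), p)
    for p in (0.1, 0.3, 0.5, 0.7, 0.9)
)
print(f"best-fitting share of well-informed commuters: {best_share}")
```

Real calibration would compare much richer outputs – flows, densities, travel times – against real datasets, but the loop of run, compare and adjust is the same.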

For example, scientists might use urban data in an agent-based model to better understand the characteristics of the people who contribute to traffic jams – where they have come from, why they are travelling, what other modes of transport they might be willing to take. From there, they might be able to identify some effective ways of encouraging people to take different routes or modes of transport.


Seeing the future

If the model works well in the present, it might also be able to produce short-term forecasts. This would allow scientists to develop ways of reacting to changes in cities in real time. Using live urban data to simulate the city in real time could help to inform the managers of key services during periods of major disruption, such as severe weather, infrastructure failure or evacuation.

Using real-time data adds another layer of complexity, but fortunately other scientific disciplines have been making advances in this area too. Over decades, the field of meteorology has developed cutting-edge mathematical methods that allow its weather and climate models to respond to new weather data as they arrive in real time.
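
To give a flavour of the idea – not the specific methods meteorologists use – here is a minimal sketch of one such data-assimilation technique, a particle filter, applied to a single made-up quantity such as the number of pedestrians in a square. Real assimilation schemes for city models would be far more elaborate.

```python
import math
import random

def assimilate(particles, observation, sensor_noise=5.0):
    """One assimilation step: re-weight candidate states by how well they
    explain a new sensor reading, then resample in proportion."""
    weights = [
        math.exp(-((p - observation) ** 2) / (2 * sensor_noise ** 2))
        for p in particles
    ]
    return random.choices(particles, weights=weights, k=len(particles))

# An ensemble of 1,000 guesses at the true state of one quantity.
particles = [random.gauss(50, 15) for _ in range(1000)]

# A new (invented) sensor reading arrives; fold it into the ensemble.
particles = assimilate(particles, observation=62.0)
print(f"ensemble estimate after assimilation: {sum(particles) / len(particles):.1f}")
```

The pattern is the thing to notice: the ensemble of candidate states is repeatedly nudged towards whatever the sensors report, so the simulation never drifts too far from the real city.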

There’s a lot more work to be done before these methods from meteorology can be adapted to work for agent-based models of cities. But if they’re successful, these advances will allow scientists to build city simulations that are driven by people – and not just the data they produce.

Nick Malleson, Associate Professor of Geographical Information Systems, University of Leeds and Alison Heppenstall, Professor in Geocomputation, University of Leeds.

This article was originally published on The Conversation. Read the original article.