“Why aren’t we relevant?” Architects and their place in Britain’s housing crisis

A misleading impression of the architect's job? A scale model of London on display at last year's MIPIM real estate conference. Image: Getty.

Even post-2008, Britain’s worsening housing crisis still lingers around the fringe of the political radar. In London, while official figures put house building requirements at 49,000 units per year – and economic research suggests a figure closer to 60,000 – output in 2014 was a meagre 18,700.

Double-digit growth in property values, a depleted social housing stock, exploding private rents and continued foreign investment have culminated in the all-too-familiar reality of a crisis of affordability – not to mention a rapidly rising £25bn housing benefit bill, a quarter of which is spent on private rents in London, and increasing homelessness.

Suggested solutions to these problems are not in short supply, and go beyond simply building more homes. We could allow local authorities the finance to engage in their own house building programmes, for example. We could introduce rent caps within a more regulated private rental sector; tackle the issue of land banking; encourage smaller independent contractors and self-builders to create a more diverse end product. All are plausible reactions to this situation.

The role of architects in all of this lies in a somewhat hazy landscape determined by the mechanisms of politics, powerful house building firms and the complex nature of the real estate market. In effect, architects succumb to the reality of being employed by a client, normally to carry out works within a highly regulated framework, and arrive far too late in the political and real estate food chain to be of any real significance in initiating how the built environment is produced. The overarching failure to solve the housing crisis has not been down to the architects, or even developers; rather, it’s because of limp public policy.

Architects are, to their credit, well-trained in spinning several bureaucratic plates at once, managing, coordinating and tip-toeing their way to the end goal of practical completion. Balancing numerous vested interests and regulatory risks, while possessing enough business acumen to make the task worth their while, the architect has clung on for many years while being derided from all corners and accused of leading the built environment to ruin.

In a process captured in the grainy black and white images of dreary Modernist estates, public trust has been slipping ever further away from architects for decades. Yet the fact is that the vast majority of what we build has little to do with an architect at all.

For example, a high proportion of architects in southern Europe currently find themselves unemployed, as commissions – beyond a few pre-crisis top-down investment projects – have become increasingly scarce. As the construction industry began to falter, architects were among the first to be deemed disposable and wholly unnecessary as budgets were squeezed. This is not down to “bad” architects: it is down to the fact that developers rarely actually need to use architects, or to spend any time or money on design.

However, we now live in a time in which we are seeing a subtle, yet potentially potent, shift in future models of housing, particularly in London. The market has failed us; now we are gradually seeing cash-strapped second-tier government bodies and councils motivated by targets in the housing sector.

In isolated examples such as Camden and Hackney, councils are becoming their own developers. Benefiting from the absurd levels of property value growth in London, the boroughs are seeking the opportunity to cross-subsidise their own schemes by providing private as well as social accommodation.

Last November, a report revealed that 40 per cent of brownfield land in London is still owned by the public sector: that means that effective house building by local authorities would go some way to plugging the gap. Where the local authorities remain impotent, however, is in the resources and know-how to carry out successful development of the sites which they hold.

This is where architects have something to offer in a world which fails to produce high quality housing. Yet they often find themselves retreating into comfortable fields of design, based purely on formal properties – a phenomenon undeniably caused by the way in which architecture is generally still taught in the UK.

But knowledge of proportion, light, space and so on forms the architect’s most reliable set of skills. Beyond considerations of form, the tools and knowledge which architects pick up across other fields, almost unknowingly, along their career path have huge potential within an institution which has a genuine need to build: namely, local government.

All this runs the risk of appearing overly nostalgic. Older members of the profession have long reminded us of the golden days, reciting to younger colleagues their favourite bedtime story of times during the 1970s when the public sector employed half of Britain’s architects.

Yet as we speak, programmes are being drafted which provide placements for young architects seeking experience in the public sector: these should be wholly encouraged. Issues of viability, strategic development and planning policy are all inevitably part of the architect’s remit: often, though, they do not feature in their day to day work, because of the way in which a building is procured.

One solution to the housing crisis is to provide the facility for local authorities to engage in their own house building programmes: this is a far better alternative than creating a liberalised planning system, which would weaken the very last powers of the architect to act as guardian of quality and longevity.

Architects must have faith in public and semi-public organisations to maximise the benefits of the huge swathes of land which remain in public hands – and develop these as part of an overall long term plan.

Thomas Feary is an MA graduate in architecture, and works in practice and as a writer in London. He tweets as @thomasfeary.

What does the fate of Detroit tell us about the future of Silicon Valley?

Detroit, 2008. Image: Getty.

There was a time when California’s Santa Clara Valley, bucolic home to orchards and vineyards, was known as “the valley of heart’s delight”. The same area was later dubbed “Silicon Valley,” shorthand for the high-tech combination of creativity, capital and California cool. However, a backlash is now well underway – even from the loyal gadget-reviewing press. Silicon Valley increasingly conjures something very different: exploitation, excess, and elitist detachment.

Today there are 23 active Superfund toxic waste cleanup sites in Santa Clara County, California. Its culture is equally unhealthy: Think of the Gamergate misogynist harassment campaigns, the entitled “tech bros” and rampant sexism and racism in Silicon Valley firms. These same companies demean the online public with privacy breaches and unauthorized sharing of users’ data. Thanks to the companies’ influence, it’s extremely expensive to live in the area. And transportation is so clogged that there are special buses bringing tech-sector workers to and from their jobs. Some critics even perceive threats to democracy itself.

In a word, Silicon Valley has become toxic.

Silicon Valley’s rise is well documented, but the backlash against its distinctive culture and unscrupulous corporations hints at an imminent twist in its fate. As historians of technology and industry, we find it helpful to step back from the breathless champions and critics of Silicon Valley and think about the long term. The rise and fall of another American economic powerhouse – Detroit – can help explain how regional reputations change over time.

The rise and fall of Detroit

The city of Detroit became a famous node of industrial capitalism thanks to the pioneers of the automotive age. Men such as Henry Ford, Horace and John Dodge, and William Durant cultivated Detroit’s image as a centre of technical novelty in the early 20th century.

The very name “Detroit” soon became a metonym for the industrial might of the American automotive industry and the source of American military power. General Motors president Charles E. Wilson’s remark that, “For years I thought what was good for our country was good for General Motors, and vice versa,” was an arrogant but accurate account of Detroit’s place at the heart of American prosperity and global leadership.

The public’s view changed after the 1950s. The auto industry’s leading firms slid into bloated bureaucratic rigidity and lost ground to foreign competitors. By the 1980s, Detroit was the image of blown-out, depopulated post-industrialism.

In retrospect – and perhaps as a cautionary tale for Silicon Valley – the moral decline of Detroit’s elite was evident long before its economic decline. Henry Ford became famous in the pre-war era for the cars and trucks that carried his name, but he was also an anti-Semite, proto-fascist and notorious enemy of organised labor. Detroit also was the source of defective and deadly products that Ralph Nader criticized in 1965 as “unsafe at any speed”. Residents of the region now bear the costs of its amoral industrial past, beset with high unemployment and poisonous drinking water.


A new chapter for Silicon Valley

If the story of Detroit can be simplified as industrial prowess and national prestige, followed by moral and economic decay, what does that say about Silicon Valley? The term “Silicon Valley” first appeared in print in the early 1970s and gained widespread use throughout the decade. It combined both place and activity. The Santa Clara Valley, a relatively small area south of the San Francisco Bay, home to San Jose and a few other small cities, was the base for a computing revolution based on silicon chips. Companies and workers flocked to the Bay Area, seeking a pleasant climate, beautiful surroundings and affordable land.

By the 1980s, venture capitalists and companies in the Valley had mastered the silicon arts and were getting filthy, stinking rich. This was when “Silicon Valley” became shorthand for an industrial cluster where universities, entrepreneurs and capital markets fuelled technology-based economic development. Journalists fawned over successful companies like Intel, Cisco and Google, and analysts filled shelves with books and reports about how other regions could become the “next Silicon Valley”.

Many concluded that its culture set it apart. Boosters and publications like Wired magazine celebrated the combination of the Bay Area hippie legacy with the libertarian individualism embodied by the late Grateful Dead lyricist John Perry Barlow. The libertarian myth masked some crucial elements of Silicon Valley’s success – especially public funds dispersed through the U.S. Defense Department and Stanford University.

In retrospect, perhaps that ever-expanding gap between Californian dreams and American realities led to the undoing of Silicon Valley. Its detachment from the lives and concerns of ordinary Americans can be seen today in the unhinged Twitter rants of automaker Elon Musk, the extreme politics of PayPal co-founder Peter Thiel, and the fatuous dreams of immortality of Google’s vitamin-popping director of engineering, Ray Kurzweil. Silicon Valley’s moral decline has never been clearer, and it now struggles to survive the toxic mess it has created.

Andrew L. Russell, Dean, College of Arts & Sciences; Professor of History, SUNY Polytechnic Institute and Lee Vinsel, Assistant Professor of Science and Technology Studies, Virginia Tech.

This article was originally published on The Conversation. Read the original article.