7. Health Care and Information Technology–Co-evolving Services

This chapter tells a story of seventy-five years of coevolution that has connected the practice of health care with the science and technology of information. It moves from experience of health care in the remote village life of my childhood to that in global village life today. It explores decades of transition onto a new landscape of disciplines, professions and services, played out within rapidly changing social, economic and political contexts. This transition has been described as turning the world of health care upside down, from an Industrial Age to an Information Age–the former grouped around service providers and the latter with a more patient-centred focus. Changing means and opportunities for preventing and combating disease have succeeded in saving lives and extending lifespans, albeit with increased years of ageing life often spent living with chronic and incurable conditions. The contributions of good nutrition, clean environment, shelter, sense of community and security to longer lifespan and healthier lifestyle, understood now in greater detail, give pause for thought about the balance, continuity and governance of health care services. Three contrasting commentaries on this era of change are introduced–from industry, science and social commentators of the times.

With the arrival of new measurement and computational methods, spanning from genome to physiome science and to population-level informatics and now machine intelligence, the Information Age has pressured health services with continually changing challenges, characterized by what have been described as ‘wicked problems’, the nature of which is discussed. Wholly new industries, providing products and services for diagnosis and treatment, many of them increasingly offered directly to citizens, have grown in scope and scale. In an era when powerful new treatments have come with increased risk of harm to patients, ethical and legal aspects of care services and their governance frameworks have come under increasing public and regulatory scrutiny. The changing scenes of education, assessment of competence to practise, accountability for care services, clinical risk, patient safety and research are introduced, all dependent on the quality of relevant sources of information.

This kaleidoscopic image of change sets the scene for discussion of the increasingly centre-stage focus on information policy. The timeline of wide-ranging policy initiatives and related organizational changes in the UK NHS, such as those that sought to improve safety, contain costs and improve outcomes for patients, is reviewed. This starts with seminal documents and policy goals from fifty years ago, highlighting issues then identified that have remained unresolved through the intervening years, despite huge public and international investment and opportunity cost in relation to competing priorities. Changing needs and increased expectations of citizens continue to challenge the status quo. This situation is reassessed, fifty years on, setting the scene for the programme of reform envisaged in Part Three of the book. The chapter concludes with a rueful reflection on the rush to computerize that has contributed significantly to the anarchy experienced in health care in recent decades, characterized as a gold rush.

The most conspicuous example of truth and falsehood arises in the comparison of existences in the mode of possibility with existences in the mode of actuality.

–Alfred North Whitehead (1861–1947)1

In the legend of Daedalus and Icarus, Icarus flew too close to the sun and his waxed-together wings melted and brought him down. By flying too low, in his escape over the sea from Crete, he would have risked the wings becoming waterlogged by spray, which would also have brought him down. There was a narrow range of viable altitudes. The flux of energy emitted from the sun sustains life on earth and gives us the energy and the wings we need to fly. It can also bring us down. Physics toys with the idea of information as a form of energy. Information is in continuous flux in life, and its corruption or misuse can bring us down, too. Genetic mutations, epidemics, manipulative distortions of news and financial crashes all have common threads of information in flux.

Tracking photons from the sun into the cascades of mechanisms in living organisms is fascinating science. Tracking information of all kinds through environment and human society is hard to think about logically and carefully, but important, too. It is not experimental science, in the same way that economics or sociology are not and cannot be. There is only one laboratory and there is little rigorous controlled experiment possible. We can base decisions on an imagined and projected reality, but Whitehead’s caution about this, headlined again, above, must be heeded. In times of great change–and contemporary information anarchy signifies great change–black swans arrive, flap their powerful wings, and multiply.

In seeking health care benefit from investment in information technology, we should take heed of the story of Icarus, and rather avoid flying too high or too low, buoyed too high on hubristic wing and feeble prayer, or staying too low, unimaginative, incurious or cripplingly risk averse. In the slowly maturing landscape of health care information systems, there lurks much ageing and obsolescent string and sealing wax.

The scope of this chapter is very large, as was that of Chapter Two on knowledge, which led into Part One of the book. It seeks to provide a historical context for the changing face of health care, in its transition into and through the Information Age. The chapter sets the scene and provides the basis of the perspective and proposals of Part Three of the book, which is concerned with the goal of creating an information utility that can meet and sustain the evolving needs of health care in a wished-for, more mature and settled future Information Society.

Thus far, health care services have tended to bend to the limited capabilities and exigencies of embryonic and immature information technology. The challenge today is to refocus attention on the values, principles and goals of health care services, making use of today’s considerably different and increasingly mature information technology to live up to and improve on these, while continuing the exploration of new needs and potential that arise. As of today, the words of national information policy continue to mirror much of what was set out fifty years ago, dressed now in the dazzle and hubris of contemporary discovery and hype. Meanwhile, throughout the National Health Service (NHS), especially in remote locations far from London and other major cities, teams struggle with obsolete desktop computers and user interfaces, far behind those that they use in their personal lives at home. How can efficiency and improvement truly be the focus of policy when basic tools of personal productivity, available now, remain withheld, and much of the resource for innovation that is available is focused on futuristic ambitions of as yet unknown efficacy and efficiency?

The next chapter, Chapter Eight, which leads into Part Three of the book, focuses on the changing nature of health care and an information utility matched to its evolving requirements. This is turning the world upside down, from what has been described as an Industrial Age preoccupation with disciplines, professions and institutions, to an Information Society focused on citizens and professionals, and their co-creation of health care in the communities they live in and serve. This brings a new perspective on roles and responsibilities at all levels, from the local to the global, with new focus on the balance, continuity and governance of trusted services, and on teams and environments capable of leading and delivering them, including in support of education, research and innovation.

To be brazenly provocative for a moment, just to highlight the challenge and cost of wasted opportunities: we have sometimes spent ten times too much, badly, ten times too slowly, and achieved a tenth of what we can and must now realize from investments in information technology if we are to emerge from the past decades of information pandemic in health care. A paper from the Humana Foundation, highlighted in a recent report on health care trends by the Deloitte Consultancy, concluded that a quarter of expenditure on health care in the United States is wasted money.2 That in a health system that spends much more, and is rated to achieve rather less, overall, for its citizens than international comparator systems. The report signalled a major reorientation of expenditure over coming decades, away from ‘process and money’ focused services to ‘outcome and value’ focused services, very much in line with the vision I am peering towards in Part Three of the book.

In Chapter Eight and a Half, mirroring Julian Barnes’s parenthetical Chapter Eight and a Half in his A History of the World in 10½ Chapters, I describe major initiatives into which I have placed much of my personal creative efforts of the past thirty years. For me, these hold the key to reimagining the current Pandora’s box of health informatics, to support an oncoming reinvention of health care services. They hold important lessons about how we should set out to make and do things, as much as about what we set out to make and do. In saying this, I fully recognize and welcome the fact that such ideas must become embedded within viable and successful supporting businesses, as well as in new health care services. I, myself, have not been sufficiently interested in, or capable of, giving such a commercial lead. I have, though, worked without financial reward to support and collaborate with people brave and competent enough to do so, and been fortunate to have had the role and remuneration from academic employment to enable me to do so. From this position, I have all the while argued for and held to a vision of the common ground on which I believe future commercial endeavours must be based, if they are to succeed in their mission of supporting a now essential programme of reform and reinvention of health care, matched to the evolving needs of the coming Information Society.

The question then arises as to how to create and sustain an information utility which serves the wishes and needs of citizens, by achieving greater and enduring rigour, engagement and trust in health care information and infrastructure, based on standards of consistent, coherent, affordable, well governed, safe and sustainable systems. This is the theme of Chapter Nine, which ventures into the sometimes-contentious world of Creative Commons, open standards and openly shared tools, methods and software.

Health care services seem always to be a work in progress and in an agitated state of flux. Circumstances, and ways of thinking about them, change continuously, as do political ends and means for achieving and financing them. If the reorganization of health care services at a national level might be compared with passage through the gate described in T. S. Eliot’s (1888–1965) poem Little Gidding, it seems that we have gone full circle several times, seeing the gate anew each time.3 From the centre of government, it is inevitably a high-level view, as if from a helicopter circling above the fray. A bit like the image of President George W. Bush, filmed viewing the Katrina hurricane-induced floods from the encircling Air Force One presidential jet!

In contrast with the poet, we cannot reasonably claim that, based on experience and learning gained in each circuit, we are seeing the gate more clearly, as if for the first time. We are seeing a different gate in a different context. Some of its structures are old and some are new; changing times rot and weather them. Some of them are hardy and others less so. Maybe different materials would fare better, but the downsides of new materials arrive with them, too. The enduring thought and perception after each circuit remains, however, of a rickety gate in need of fixing. We should always strive to make things better and more equitable, while recognizing that life itself tends towards becoming a rickety gate, and that health care services cannot always fix it!

This chapter now traces the recurring dilemmas about health care experienced through the Information Age, alongside the social, scientific and professional contexts of their times, the advent of information technology, and the information revolution it heralded. I draw on my childhood experience of social care and my career-long engagement in academic and professional communities of health care around the world. I start with some memories of health care in my childhood, revisiting the remote English village life I lived then.

Village Medicine–Snapshots from Earlier Times

Miss Marple, the detective in Agatha Christie’s (1890–1976) novels, was an amateur sleuth who lived her life in a small village. She claimed that all human nature was revealed in observation of village life, and this was all she had, or needed, to go on in solving its crimes. Village doctors did not have or need a lot more, either, in diagnosing its illnesses.

Two of my great aunts lived in a tiny village, Hawkesbury Upton, on the Cotswold Hills in rural Gloucestershire. The family ran the village shop, which doubled as the pharmacy and bakery, and their uncle was the village doctor. The shopfront window had large glass flasks on display, each filled with a different coloured water, the trademark of a pharmacy. The doctor’s surgery was immediately behind the shop and the family lived in a small cottage, in a row of them behind the shop building. Each cottage had a large back garden, and a small driveway led from the front of the cottages out onto the road, for horses and carts–originally no electricity and no cars, of course, just water and limited local drainage. In their living memory, relatives had walked the twenty or so miles to and from the city of Bristol, to sell their wares in the markets there. I recall my aunts’ mention of the famed Dr. Jenner, in their stories of village life.

Edward Jenner (1749–1823), the pioneer of vaccination and founder of the science of immunology, had lived and worked nearby in Berkeley. He trained in London at St George’s Medical School. In his early village life, he observed that the women who milked the cattle, through their close contact with cowpox, acquired immunity against infection by smallpox. He had himself been painfully inoculated with pus collected from patients infected with smallpox. This led him to conduct experiments on combating smallpox through vaccination. A person of very wide scientific interests, he devoted much of his time to the development of the method.

Smallpox is thought to have emerged ten thousand years ago in Africa, then spread to Europe in the fifth to seventh centuries. It was frequently epidemic in the Middle Ages and was taken to the Americas by the Conquistadors, and spread elsewhere around the world–a spread occurring over centuries that nowadays would occur in weeks and months. In 1797, Jenner submitted a paper seeking to alert the Royal Society to the importance of vaccination, but the idea was rebuffed as too revolutionary, and he was told to go away and do more work. There was a powerful anti-vaccination movement in those times, too! Vaccination was subsequently recognized as of huge benefit to the country’s health, but Jenner did not pursue it for his personal gain–his income from other sources suffered as a result.4

My family often visited these great aunts as they lived on into their mid-nineties, hauling themselves up and down the very steep staircase in the cottage, cooking on a coal-fired range, gardening and talking. My grandmother from the same family had diabetes. Equipped with tiny weighing scales and spirit flame to sterilize insulin injection needles, she showed us grandchildren how to manage her medication and diet. Pneumonia struck and killed her, in the same village, in her early eighties.

In their childhood, my great aunts told us, patients would come to their doctor uncle for such insight, advice and medicine for their ailments as could be provided, and often comfort and encouragement were the most useful. But there was trust and an expectation of medicine and cure, and this had to be addressed, too. At the completion of the consultation, a note or prescription was written out and sent through to the shop. Among these, they said, was sometimes a request to dispense doses of ADTWD, which was duly acted upon. Laughingly, they joked that this stood for Any Da… Thing Will Do!

It sounded a bit harsh and unkind to our ears, no doubt, but it was probably not so much a dismissive prescription for the worried well as a rueful reflection of the reality that the problem was beyond medical scope or means. It is easy for a stage play to laughingly dismiss the craft of the physicians managing ‘the madness of King George’, and his then unrecognized porphyria, by extraction of vapours, or regulation of high blood pressure by letting blood, but ways of thinking about illness and attempts to combat disease are very much the art of the possible, in time, place and wider context.

There are and have been many doctors in my immediate family. I wrote in the Introduction about my polymath uncle Geoffrey, a casualty surgeon, as specialists in today’s Emergency Medicine were then known. My mother’s other brother, Jack, a general practitioner (GP), sadly took his own life at a young age. Medicine can be a very tough profession. Doctors today are from a different mould, no longer living on or placed on pedestals. The medical arts have been demystified in the Information Age, while the expectations placed on them have grown.

I recall another family visit in the garden of my great uncle Edwin, a retired GP, at his house in Southsea on the coast of Hampshire. My father was a keen gardener. He gardened on a large scale, feeding twenty-five children and staff from a huge, partly walled kitchen garden in the twenty acres of land belonging to the children’s home we grew up in, run by my parents. He produced enough to send to other children’s homes in the county. I can see us, now, gathered in the garden in Southsea and Uncle Edwin talking about his life as a GP. He showed us the device used for excising infected tonsils. In early times, this was commonly done by a GP, with the patient, usually a young child, lying chloroformed on a table in the surgery. He pulled up a cabbage and demonstrated the procedure by chopping off the stalk. I can still see that image in my mind. I was six years old and had recently had my own tonsils removed, after a long period with persistent sore throats, so my experience of the episode and the post-operative pain was still fresh. Maybe he was thinking it might make me realize that I had come off lightly!

My own most serious childhood brush with acute medicine was a major concussion and bruising after crashing while riding my new bicycle, which left me unconscious for a long time. I had been racing along the long gravel drive from the gate to the house of the children’s home, against one of the other children, who was running. The beloved bike had been renovated and painted by my dad, for a birthday present when I was about nine years old. I came to, lying on my parents’ bed with the village GP in attendance and worried parents and others around. I was sick, aching, cut and sore, severely concussed and confined to bed for days, and then slowly nursed back to recovery at home. No ambulance, accident and emergency department, or hospital attendance. Just the one GP covering several local villages, his box of tricks, bed rest and my parents’ care at home.

In everyday life, care was largely based on domestic skills and country folklore: gargling with salt water for throat infection, inhaling menthol vapour for colds and lung congestion, and taking aspirin for pain relief. My great aunts took half an aspirin tablet every day throughout their adult lives, as an anticoagulant, they told me. In my village, such remedies were available at the small village general store, which doubled as post office, bakery and grocery, although most families grew vegetables. Village communication centred around the primary school for fifty pupils, church and church hall, pub, sweets shop, farmyard, woodyard and the village bobby’s (policeman’s) house. School dentistry came in the form of a dental team, who extracted numerous rotten teeth in their mobile caravan-based surgery parked in the school playground. Those awaiting their turn for so-called ‘laughing gas’ were not laughing, but subdued.

Health Care Services Today

The childhood village I knew is no more; it now has quite different global contexts and connections. Geography no longer functions in the same way as a moderator of information, service, expectation and demand. And health care services today are complex beyond recognition. They are separated but not separable; managerially segregated more than integrated. The village health services of my childhood were largely centred on care, as when dealing with my severe concussion after the bicycle accident. They are now more heavily focused on treatment. Resolution and palliation of exacerbated chronic back pain, in city and village today, are predicated on access to, and length of queue for, physiotherapy, X-ray or magnetic resonance imaging (MRI) scan, surgery, and tolerance of powerful analgesics. Regular exercise classes to guide and support are mostly out of scope, save for those who can pay.

The fragmentation of efforts to treat and care has highlighted and exacerbated the difficulties in maintaining resilient balance and continuity in what is done and sustaining ethical governance of the services and technologies employed. Especially so for services working at the interface of mental health problems, physical disability and what are termed illnesses of poverty. With increasing range and effectiveness of interventions have come increasing needs for care, especially in relation to chronic and incurable diseases, and lengthier old age. Much of the caring load is shouldered by families and friends at home, and by the goodwill of neighbours and community volunteers.

Aftermath of War and Seven Decades On

In the UK, the experience and attrition of the Second World War were followed by years of hardship in the reconstruction of economy, buildings and lives, buoyed by a spirit of relief and hope for the future. The cost and destruction of wartime created a new ground zero. It opened the way to radical new thinking, with openness expressed through mutual trust in common endeavours.

The hope for transforming change was notably stimulated by William Beveridge’s (1879–1963) report on social services, published in 1942. Politically and professionally contentious at the time, but striking a chord in the country at large, this advocated and came to underpin the reframing of health care services. Its wider focus echoed the social deprivation experienced in the years of recovery from the 1914–18 war and the economic collapse of the 1930s, which impacted and influenced my parents’ lives. My father’s brother, once successful in his work, never recovered his zest for life after many years of unemployment and poverty. The report was a powerful signal that shaped policy of the early post-war years. It used graphic language to describe the need for battle on five fronts–want, disease, ignorance, squalor and idleness. Elimination of poverty, a national health service, universal education, good housing and full employment were adopted as essential elements of national reconstruction.5

The UK National Health Service was established in 1948, when I was not yet three years old. It was thought of as a central organization of the professional practice of medicine. Nurses and nursing care were generally thought of as subservient to doctors and medicine, in both gender and professional terms. A generation of young men had died or been severely disabled in warfare, and this loss echoed sadly in the lost life opportunities of many women of those times, and of returning soldiers.

In later decades, and battle-scarred in his efforts to promote international focus on climate change, my university physics lecturer John Houghton (1931–2020), who died early in the pandemic from complications of Covid-19 infection, wrote that humankind might only take such major issues seriously after experiencing a disaster. In the information era, advancing technology has increased the potential scale, spread, impact and cost of destructive human-made mess-ups. The experience, today, of disease and threat to livelihood in a viral pandemic may also prove a spur to new thinking. It is in no way the same experience as armed conflict and deprivation of wartime, but in the response to the fears and uncertainties of the times, expressed through mutual support within close neighbourhoods, there is similarity.

In 2020, when for the first time there were more people aged over sixty-five than under five, David Goodhart’s characterization of the social and political crisis of today was again radical in its thinking.6 His diagnosis is of an accumulating underlying imbalance of head, hand and heart, in social, economic and political life. Goodhart observes that society has split between poles of globalism (characterized by what he calls ‘anywhere’) and localism (characterized as ‘somewhere’). He describes imbalance in the social status, value and reward accorded to the contributions of all citizens, reflecting head (cleverness), hand (skill in making and doing) and heart (care). His anywhere and somewhere are metaphors for interacting global and local contexts that play out in people’s lives. The curriculum of medical education, today, emphasizes integration of knowledge, skills and attitudes, mirroring Goodhart’s triangle of head, hand and heart.

The 1942 Beveridge Report and the 2020 Goodhart book combine observation of ailing society in two different eras with account and reasoning about how these came about and what needed to be, could and should be done about them. Diagnosis and prescription of treatment for ailing society and for an ill patient bear some comparison.

In a clinical setting, with a patient who presents as sick, the professionals’ goal (with which the patient tends to concur!) is to help them cope and get better, as best as possible. Easily said–sometimes clearly, straightforwardly and quickly achieved, but often not. Treatment and clinical management goals set, actions taken and their reasoned basis articulated, evolving context monitored, progress made and outcomes resulting: all of these provide evidence to inform the review of what was done, how and why, and possible need for adaptation and change of approach–maybe more of the same, or a different medicine, and maybe less.

In clinical practice, failures tend to disappear out of focus. Patients die, problems of acute concern are resolved, or they dissolve into longer-term concern for the effective treatment of chronic illness, adjustment of lifestyle and supportive care. They move beyond clinical professional scope into the scope of the coping ability and capacity of patient, family and their local community, and both local and global support services available to them. In society more widely, failures of health care policy may lead to crisis and breakdown; they may persist, adapted to or unchanged; they may amplify or decline. Global policy and decision makers perceived to have failed, or to be no longer relevant, lose credibility and power. Wider ailments of society become local problems of personal health care–the Beveridge giants, and the Marmot inequalities of health. Policy for health care easily goes astray in the noise and bias of changing times.

Chaotic presentation of illness in a patient has first to be assessed, and immediate priorities coped with, before the underlying problems identified can be treated and managed clinically–usually, the earlier addressed the better. Health care starts with patients, family and community. These people are on the frontline of early awareness and experience of the signals and noise generated by the onset of disease. Health care systems must first connect with, cope with and reflect that reality, and be demonstrated and observed to do so. The global and local realities of health care policy, systems and services need to cohere–and be seen to do so, for citizens and professional teams alike–if they are to prove efficient and effective, both in deciding on and achieving their goals. Beveridge, Goodhart and Marmot attest that they do not cohere. The advent and anarchic patterns of adoption of information technology have played a significant part in both revealing and exacerbating this situation.

In the light of recurrent failure, questions arise and persist concerning not only what was aimed for, but also how it was approached and whether it has proved to be, and remains, a realistic goal. They persist in the context of information policy for health care. What and where is the common ground on which citizens, local communities and professionals engage? What and where is the common ground on which health care systems and services engage? What and where is the common ground of information technology and information for health care?

In such deliberations there are helicopter views and views from ground level. High-level views look further, but less specifically and sensitively. Ground-level views may not see beyond the reality that lies nearby in their focus of interest, and can thus obscure, dominate or preclude wider perspectives. Economists talk of macro- and microeconomics. Macroscopic focus is on whole systems, broad brushes and big picture, and head-up overview. Microscopic focus is on parts of systems, fine details and the hard, head-down graft of coping with and implementing action, close at hand. They may be pursuing the same or quite similar goals, in one way or another. One is mainly about what is sought, the other mainly about how to achieve it. These are matters of head and hand, and success often depends on a good heart. Head, hand and heart cannot always be balanced, but they need to connect, so that macro-level goals are tackled effectively at the micro-level. Where they fail to do so, they easily stir angry feelings on all sides.

An incident comes to my mind, involving a rather cantankerous professor of surgery, whose weekly ward round I was invited to attend in about 1969, as I mentioned briefly in Chapter Five. I was also invited to attend an operating theatre, to observe the innovative open-heart surgery of the times, made possible by extracorporeal blood gas exchange. I have vivid memories of those wavering first encounters with acute medicine services! The surgical professor specialized in a technique of gastric surgery that severed the vagus nerve, to treat patients suffering from stomach ulcers–a common approach, then, in combating the erosions stemming from stomach acidity.7 He approached the bed of a clearly very unwell and seemingly very depressed patient he had operated on a week before and enquired after his wellbeing. Seeing and hearing the patient’s considerable distress, he offered crisp words of sympathy and turned quickly to the ward sister, suggesting she might offer him a glass of sherry each day! Walking to the next bed, closely followed by his senior registrar, he turned to him and said, loudly and angrily, that he did not expect to see that patient still there at the next ward round, and to ‘get that patient well!’ I can see the scene in my mind as I write.

Those were different times, more normalized to what, for us, seems a chillingly autocratic, archly detached and ‘Doctor in the House’ manner of directing clinical teams, but the relevant concern was, and remains, how? This situation, in microcosm, is what can easily happen with health care. If one cannot cope, another gets reprimanded, and rides the punch as best they can. People do get angry when even their best efforts and intentions run aground. As with the irate professor, intractable challenges give rise to a good deal of anger and finger-pointing within health care–from the top floor of the NHS in Whitehall and its politicians and managers to the most remote parts of the community served. As with the hapless senior registrar, teams at the bedside and in the community are all too easily chastised, resulting over time in them losing motivation and sometimes, themselves, falling ill. It takes considerable and invaluable dedication and balance of heart and mind to steady the hand and keep going. Those facing these situations can easily become like the depressed patient in bed. Senior doctors in that long-ago cantankerous professor’s team told me that he drank heavily in his office at work and operated unsafely. He was maybe depressed, too–he died quite young. Health care, like teaching and policing, is a tough profession–tough to organize and tough to cope with.

The Information Age is revealing the inequalities and imbalances of health care in a new light. To the extent that computerization fails to engage realistically with their resolution, it exacerbates them. To be a creative agent of their resolution within the oncoming Information Society, the information utility this book is arguing for must be conceived and created as a balance between local and global services, and the needs and experiences of those they serve. Resilient balance, continuity and governance are central themes it must pursue. BCG (Bacillus Calmette–Guérin) vaccination was highly effective against the tuberculosis epidemic. Information utility must focus on another BCG–Balance, Continuity and Governance–to counter the current plethora of unbalanced, discontinuous and unregulated sources of information. Citizen engagement, professional teamwork, education, innovation and professionalism in health care services are all in need of the common ground that a good information utility can enable and support.

The National Health Service

In Medical Nemesis, Ivan Illich (1926–2002) described how, in the fervour of the French Revolution, it was promised that liberty, equality and fraternity would banish sickness–a national health service would take charge.8 The promises of the UK National Health Service were lofty ideals, but not quite that elevated!

The founding father of the NHS, Aneurin Bevan (1897–1960), was a beacon in my parents’ lives. They were strong believers in the Beveridge and Bevan missions and relied and acted on this belief, thereafter. As mentioned before, my dad left school at fourteen–his father had disappeared to Australia and his mother died of cancer when he was a teenager. My mother left her domestic science college, and a subsequent period on the staff at Gordonstoun School (including looking after Prince Philip when he was a schoolboy there), to head off to Catalonia to look after refugees from the Spanish Civil War. The ever-changing landscape and experience of health care services over the following decades bemused and upset them in equal measure as they grew older. They experienced the evolving science and technology underpinning its methods, and the professions and organizations delivering its services, through the decades of challenged and changing post-war society. It was a kaleidoscope of images and feelings–gratitude, trust and hope mixed with growing experience of incoherence, inconsistency and disappointment.

From the Beveridge Report had emerged the policy and plan for a comprehensive national system of social insurance ‘from cradle to grave’, paid for by working people and providing benefits for those unemployed, sick, retired or widowed. It laid the foundations of the NHS in a society where medicine was less capable, diseases more often short-lived and ageing more rapid. Childcare, and other care services, such as convalescent care, were organized in residential settings, such as the children’s home where I grew up. Mental hospitals provided last-resort containment of the uncontained or uncontainable problems of mental illness. They were awful places to experience. I did so in a volunteer work camp in my teens, making a garden for the residents to enjoy, and, in later years, visiting family and friends unlucky enough to need their care.

In this policy framework, power over the management of the health system and practice of medicine became an increasing concern of central government. Specialization of services, and their associated professional skills, increased alongside advances in knowledge of disease and availability of effective treatments, reinforcing the case made for centralization. Specialism and its associated research, education and regulation could only advance, and be afforded, for the whole population, when organized at district and regional levels and in national centres.

Specialist services are primarily acute and episodic in nature. Even people living in a remote village can and will, for a time, accompany a child being cared for within the unique and necessary environment of a national centre.9 But it is not home, and life must continue at home. There is a natural wish for acute services to be conducted as close to home as possible, and this brings tension–communities campaign for nearby hospital facilities and defend against their loss. Effective acute services delivered at or near home are much desired. A better balance of hospital and home is increasingly within scope of the Information Age.

Social care services are, by their nature, longer term and predominantly less specialized. They are needed, and need to operate, near to home. They have nationally defined frameworks that guide and support good practice, but the management of these services rests with local government. And the disjunction of operation and governance of health and social care services is destabilizing and inefficient. It has reflected the lesser social and professional status of care services, seen as a priority of heart more than head. Care services are matters of hand, as much as are those of the surgeon’s hand, but, as with many such skills, in Goodhart’s view, not valued as such. A patient looked after with thought and kindness may cure themselves of a stomach ulcer and not require surgical or pharmaceutical intervention. Their health and life may never fully recover from laparotomy and section of the vagus nerve, or they may experience harmful side-effects from prolonged drug treatment. The cost and benefit of helping a patient cope with and find resolution of a stomach ulcer in a caring community and home setting, with due diligence that nothing more sinister is evolving, and the cost of the potential medical and surgical alternatives, bear no comparison. The achievement of such a win-win scenario for patient and health system will depend significantly on a better connection between these different worlds of health care–a more individual citizen- and patient-focused information utility will be central to the pursuit of that goal.

A text-rich diagram showing changes in the UK health service. It reads: 1946: small volume, low profile, general acceptance / from 1970s: Studies in variations in care: e.g. D&C (13-fold), hysterectomy (5-fold), endoscopy (7.8-fold), cholecystectomy. Pressure on resources / 1996: large volume, high profile, critical scrutiny. Reasons multi-factorial: pressure of medical advance, recognition of variances in care, public/patient pressure. J. Swales, NHS Director of R&D, 1996.

Fig. 7.1 The changing NHS from 1946 to 1996–after a lecture by John Swales, Head of the NHS Research and Development directorate, 1996. Image created by David Ingram (2010), CC BY-NC.

In celebration of the coming fiftieth anniversary of the NHS, and halfway through my career, the then head of Research and Development of the NHS, John Swales (1935–2000), gave a seminal lecture at St Bartholomew’s Hospital (Bart’s), charting the changes in the service since its inception. He was a fellow Professor of Medicine and colleague of John Dickinson, and came there to conduct final examinations, I recall. They shared an interest in the aetiology and treatment of hypertension. In the lecture, a slide from which I made notes (see Figure 7.1), he observed that medicine at the outset of the NHS might be characterized as small, with low public profile, and enjoying general and uncritical acceptance–its interventions often being relatively ineffective but harmless. By comparison, fifty years on, it was much larger and more effective, while, at the same time, at greater risk of doing harm. It was a high-volume service and under much increased critical scrutiny. He highlighted the pressure of medical advance, variance in the pattern and quality of care provided, and public and patient pressure as underlying this changing pattern.

Risk and safety have become overarching concerns in medicine. As the tools and methods available have increased in power to do good, they have also increased in power to do harm if misapplied, either by misguided design or unlucky accident. New balances of risk and opportunity, cost and benefit, unfold alongside innovation. As we saw in Chapter Five, new ideas and designs often stretch the boundaries of welcomed, accepted and trusted practice. Engineering experience and expertise are central in partnering scientific creativity and in focusing and guiding its fruits to useful ends.

There has been significant change, also, in the demographic diversity of the UK population. Diversity and inequality of health care among different socio-economic and ethnic population groupings have emerged more clearly as of major significance and concern. These manifest in the prevalence of disorders, the effectiveness of interventions and inequity in their provision. They have many determinants, and Marmot at University College London (UCL) has been a formidable champion of this important research and advocacy. In another dimension, there is considerable genetic diversity–as studied by my colleague Bernadette Modell in the context of her pioneering work, centred in North London and now influential internationally through the World Health Organization (WHO), which I profile in Chapter Nine. She focused on integrated community and hospital services for people who have inherited or are likely to inherit haemoglobin gene variants associated with the blood disorder thalassaemia. The ethnic diversity of these variants, globally, and the diversity of socio-economic stratification in one London Borough, are illustrated in Figure 7.2.

A presentation slide on diversity. Ethnic: prevalence of Hb gene variants–illustrated by a graph entitled “% of the population carrying a Hb disorder”, comparing carriers as a percentage of population by location (Modell, 1999). Socio-economic: within one small area of London–illustrated by a colour-coded map of Camden and Islington.

Fig. 7.2 Maps of ethnic genetic diversity, internationally, and diversity of socio-economic stratification in one London Borough–after Bernadette Modell, 1999. Image created by David Ingram (2010), CC BY-NC.

Since its inception, the NHS has featured continuously at the centre of UK national and local politics. Different political perspectives have battled one another, aligning with different models of how services should be scoped, organized and managed–operationally, financially, commercially and professionally. In recent decades, ideas have swung like a weathervane, buffeted by centralizing, localizing, nationalizing and privatizing winds.

It has become a habit to impose additional breaking changes on ailing services. Review and reorganization have led to costly and unproductive waste of time and resource, at onerously frequent intervals. This has increased the burden on services that were already struggling to keep pace with scientific and technological advances, to improve, achieve and sustain continuity of care. Resource needed at the coalface of care has been diverted to new organizations defining, pushing for, managing and regulating change imposed from above. The standard of ward level accommodation on the ground has been let down–in too many hospitals across the country it is old, decrepit and unclean. This is not a conducive environment for treatment and recovery.

Many people work and seek meaning in their lives through service within the NHS: those who engage clinically and in social care, those who provide and administer support services, and those who manage services and their relationships with organizations beyond the NHS. They constitute a tremendous asset. But such appetite for and commitment to the NHS have tired noticeably in recent times. I have seen this when working with clinical colleagues close by the wards, through our clinical family’s experience, and in visits to family and friends being cared for.

These endemic problems of the NHS reflect its scale, range and diversity. It encompasses both laudably leading, and unacceptably and worryingly wayward, facilities and services. Viewed as a whole, governance is also fragmented and unwieldy, hampered by the separation of health and care policy and practice. As a result, information systems are ill-fitted to connect with and respond flexibly to local health care needs and to advances in the underpinning science and technology. As with a super-tanker delivering oil, which cannot slow or change course for many miles, even with the application of maximum thrust from its power source, it is set in its course. It would be an unimaginable nightmare to reroute the delivery of oil by running a full tanker aground, mopping up the spillage and sending for another one! Repetitive rerouting of the delivery of health care services is a perilous course.

It is not good sense to project the responsibility for this complexity onto information technology (IT), either as cause or panacea; it is and can be neither. IT mayhem typically reflects poor understanding, inadequate capacity and capability, and poor practice. As a colleague sitting on an overseas national policy board for health IT remarked to me recently, its members know things are not good, have a limited sense of why, have little idea how to improve, but above all fear that what they decide to do will prove a mistake. It is little surprise that senior health managers tend to view too close a connection with IT as career suicide!

When I sat for a period on the equivalent national IT board for the NHS–populated, it seemed, mainly by battle-weary and sometimes rather cynically resigned managers–I was quietly removed for raising what I saw as the root causes of their dilemmas, which were perhaps too difficult to hear. Or perhaps my thoughts and ideas made no sense to their ears, and they knew better. The then Prime Minister was persuaded to commit many billions to a programme of investment for which the basic tenet and promise from the industry–in simplest terms, pay us enough and we will do it–proved substantially unsound. I watched this path develop in surreal meetings of consultants, companies, health managers and bemused IT departments in our local health economy around UCL. The consultants and companies were very well paid, the real costs in the health economy were hidden, and those keeping services running were distracted and wearied. It was not all bad–some good infrastructure did emerge, but too distant from the direct support of patient care that was needed.

Given that the NHS, corporately, speaks of itself as a learning organization, it is amazing that it retains so little knowledge of past policies and programmes, which have gone through multiple groundhog days of rebooted information strategy. Some things are, no doubt, best forgotten, but some of the experience could and should have been learned from, better. The root causes of the problems of IT legacy and underperformance in health lie in inadequacies of method, capability and culture–all problems of connection. They are all problems in need of a good and proven answer to the primary question: How? Answers to What? and Why? questions are endlessly rehearsed and reframed, with little added meaning, as is evident from the track of policy statements over fifty years that I lay out below.

The how answers proffered to entice the opening of the public purse were typically slides and spreadsheets, presented by people who had never designed, implemented or operated such systems. Those listening to proposals for significant scale of action pursuant to policy implementation should be guided by a principal imperative–to be informed and critical when reviewing what is being talked about and promised. The response to the challenge of ‘show me’ was, too often, like pointing to a car at a motor show, hidden under covers before launch, to whet the appetite but not reveal. And in any case, there were few experienced mechanics around to recognize and understand an engine under a bonnet when they saw one. The who, when and where answers were a mixture of assumption and delegation. The domain had become a minefield of political, commercial and managerial mayhem.

It is a good goal to seek to use IT to help and support people, services and organizations as they adapt and grow, locally and across the now almost fully connected world. Wisdom in pursuing this would be to recognize a new culture of information as an organic entity, best served by identifying ways to nurture and help it grow well. How we do that matters as much as, if not more than, what we do. The only route to doing things better is to learn how to do them better. The NHS should have identified and learned from IT pioneers, who were already charting the way and building it around them. In practice, it marginalized them and placed its trust in hubristic and unmet promises from people and vested interests of less relevant track record, staking its chips on squares where burden was often added more than relieved.

Balance

There are many kinds of balance and imbalance in play in health care. They reflect the complex contingencies of individual, family, community and environment. They span personal, professional, scientific, social, economic and political domains, and changes among them over time. Imbalance and inequality of health care have persisted and become further highlighted in the anarchy accompanying transition through the Information Age.

The human body is an autonomous entity, evolved to preserve homeostasis–resilient immediate bodily balance and long-term sustainability. Medicine is focused on supporting or restoring this homeostasis. Accidents and disorders of all kinds disturb and threaten this balance–mutation of genes and accident and shock of traumatic events can grow to overwhelm both physiological and emotional balance. There is ever-growing scientific and clinical knowledge of the body’s homeostasis and how best to cope with and treat the disturbances and threats that arise. Metabolic and physical balance and resilience get harder to sustain with age–I know that experimentally at my age, when dance and Pilates are proving amazing examples of how one can help oneself to maintain balance of body and mind, while keeping fit and having fun.

Good health care is a balance of giving and receiving, of what can and should be offered and done in support, and what can and should be expected and accepted, by and from whom. Everyone professionally involved in that balance is bringing themselves, and their knowledge, experience and expertise, to bear on supporting health and providing care. They are often exposed to extremes of human need and suffering among those they serve, and to difficult-to-meet expectations vested in them as the supporting professionals. They themselves have special needs. We all need to be cared for and we all need to give and receive care. There is, as ever, a balance of rights and responsibilities.

The centre of gravity, or point of balance, of the expectations and experience of patients and professionals receiving and delivering health care services is not a fixed point and has changed considerably over time, as has the trust that holds things together; very much so in the context of the Information Age. The fulcrum has not easily adjusted in keeping with this change, and health care systems have become overburdened and swung increasingly out of balance. This instability has led to overloaded services and related critical adverse events, litigation and enquiries, reflecting failure, dissatisfaction and public concern. Inevitably, such imbalance does sometimes ramify into thoughtless, incompetent and uncaring action by service personnel, with potentially unhappy, harmful and inequitable consequences for patients. But the high level of personal commitment of its coalface workforce is a common ground on which the professions pride themselves and the NHS depends, and that workforce sometimes experiences overwhelming personal pressure in delivering and sustaining wished-for high standards of care. Health care services look to be at their affordable limits in society. Something more than money is needed to restore balance.

The state of the health care system and the state of health of the individual citizen are connected. Governments may issue White Papers about ‘The Health of the Nation’ but, in their essence, health and care are personal matters. As with clinical decisions about the health of an individual patient, policy for health care services often does not have clearly right and wrong answers. It reflects a balance of advocacy and decision on behalf of both citizens and services. Patient care is nowadays seen, more explicitly, as a balance of a patient’s individual needs and wishes, the roles of professionals and services that are there to treat and support them, and the roles they themselves can and should play, in sustaining and maintaining their own health. Whatever the choices made, decisions reached and actions taken or not taken, there are consequences, for better and for worse. These balances have become more evident and explicit in the Information Age.

Continuity

Continuity of health care services is a central concern in need of closer attention. The anecdotes from my family’s village life were examples of the limited range of what was possible in the countryside, with radio but no television, and with the small community hospital, dentist, pharmacy, library and bookshops five miles away. Very few people possessed the rudimentary small cars of the era and there were only twice-daily buses to the nearby town. The next level of hospital service was centred twenty-five miles away. But there was, despite that, good continuity of care and communication, in community life and through local visits and the telephone. By and large, people expected, and were expected, to cope as best they could. Villagers feeling ill suffered more than they perhaps should or might have, accepted the realities of what was, and generally trusted in the good intentions of all concerned, with limited expectations and a generally good spirit, in my recollection. Expected lifespan was a lot shorter, of course. There were many inequalities of village life in the countryside, including burdens of disability and poverty, but the lone, multi-village doctor was not in the firing line of people’s dissatisfactions.

Health care services through the intervening decades since then have become more expensively capable and more extensive, specialized and fragmented. Specialized services exist within more tightly managed boundaries of professional roles and responsibilities. Consequently, the patient and their ongoing care and support needs are partitioned across many interfaces of specialism and organization. These interfaces are often seemingly ownerless. Each side has its own image and perception of the world on the other side of the interface. Admission and discharge across the interface seem akin to steps through the magic window of Philip Pullman’s trilogy, His Dark Materials, between different physical and emotional worlds of bodily health and continuing care.10

ConCaH, which stood for Continuing Care at Home, was a small national organization set up in the mid-1980s by an amazing GP pioneer, Bob Jones, working in Seaton on the southwest coast in Devon. He was a bundle of good humour and immense energy. In the 1980s, we worked together under the auspices of the Marie Curie Foundation, developing a videodisc-based professional educational resource entitled ‘Cancer Patients and Their Families at Home’.11 He asked me to become one of the ConCaH patrons, to represent the potential of IT to transform the then current scene in improving continuity of care.

One of the activities Bob pioneered was a series of one-day meetings, linked with the national Parkinson’s Disease Society, in which he was also active. The purpose was to bring together patients with Parkinson’s disease and the different professionals providing them with health care support, to share their different perspectives and experiences of their services. In the mornings, the professionals each separately described their roles and contributions–secondary and primary care doctors and nurses, community nurses, occupational health and social care teams–sometimes seven key workers for a single patient. In the afternoon session, patients being looked after by these people shared their experiences of the care and support that they received. This led into further discussion and re-visiting of the morning session.

One notable case study was of a family with whom education and employment services were also involved, and their description of a week in which there were twenty-seven unannounced and uncoordinated visits to their home. It was an extreme example, but differences in mutual awareness among professionals, and the lack of coherent information and continuity of support for patients, were recurring themes.

There is something reminiscent, here, of the Thomas Lincoln (1929–2016) story I recounted in the Introduction, where the extent of data collection in the management of severe pneumonia correlated with an inability to act effectively to combat the disease. There is also something of the example mentioned in Chapter Four, on the modelling of clinical diagnosis, where information overload was seen to confuse rather than clarify decision and action. And something of the information and entropy thread followed in Chapter Six, descriptive of order and disorder of systems.

Chaotic times can reveal underlying strengths as well as lurking problems. Personal crisis experienced can reveal and engender self-reliance and a stoic capacity to cope, as well as an inability to do so. But the discontinuity of care highlighted in the ConCaH story is costly and inefficient, as well as ineffective and unnecessary. We should not complain too loudly, though. This is a future we have created, and it can be created differently. Edmund Burke (1729–97), the Irish politician whose statue stands outside Trinity College Dublin–I often passed it when attending meetings to examine students at the College–wrote in 1770 about the ‘cause of the present discontents’. He wrote that ‘To complain of the age we live in, to murmur at the present possessors of power, to lament the past, to conceive extravagant hopes of the future, are the common dispositions of the greatest part of mankind’.12

Disjoint sources of information foster the noisy complexity of information systems, obscuring signals that need to be seen and heard, and leading to the fragmentation of services that depend on them. There is an example to learn from, of things managed better than this. Within single and more acutely urgent professional domains, managed and treated by one well-led and focused team, services seem more often to be enabled to perform well. This was a key theme of Atul Gawande in his book, Better, where he described his survey and visits to regional centres of excellence in the USA, for patients treated for cystic fibrosis. His goal was to understand what made one better than another, in terms of their organization, leadership and teamwork, and the quality of service they provided.13

In chronic disease management, which interfaces hospital and community-based services and self-care, isolated records have not connected well, with gaps persisting and little continuity. Such disconnection has become pervasive in the Information Age. It has been estimated that, on average, of the order of twenty percent of professional time is spent managing information. Assembling good and useful data costs time and effort, of course, but twenty percent is a considerable overhead and imposes significant operational burdens on teamwork. If the information systems being fed are not well tuned in support of health care needs, the loss of resource and capacity to treat and care is disabling–potentially making things worse, overall, not better.

Governance

Governance, which I take to include issues of professional ethics and regulation as well as oversight of services, has assumed heightened importance in the Information Age. The needs and legal requirements to handle personal data confidentially and meet more exacting standards for demonstrating the effectiveness of services and safety of medicines and devices, have multiplied in recent decades. Professions are nationally regulated. Medicines and devices are overseen by national bodies, consistent with international agreements which govern the markets and industries that supply them. Information systems are slowly being assimilated within this framework. Health services embody a mixture of local and national governance and accountability, within the organizations that manage them and the communities they serve. Each level of governance has different requirements and views of operational records of care. Health services are, in the main, governed nationally, high up, and care services locally, low down. Both are experienced locally, by patients, carers and professionals. There is one operational reality to observe, record and account for. There are many and inconsistent accounts, which can then easily misrepresent and confuse.

Regarding local services, the situation is well symbolized by the Escher lithograph entitled Up and Down (1947; discussed in Chapter 6). The world as a small boy sees it, looking from low down, and the small boy as the world sees him, from high up. The small boy on the ground might be a patient. The viewer above might be a governor or regulator, of one sort or another. The join of perspectives is seamlessly fantastical.

Systems of governance seek to provide balance and continuity between the dual perspectives represented in this image. For this to succeed, they need trusted common ground on which to operate. To be effective, this requires coherent data collected with a minimum of burden on practice. Critical incident reporting across the NHS, involving thirty different formats of data collection, is not a happy state of affairs, as I note in the section below on the ‘wicked problem’ of health policy. It reflects a general lack of coherence of data and record that has become progressively unmanageable and ungovernable.

People do not doubt the good motives and intentions in play on all these levels. But the picture one sees and the evidence one collects depend on where one is looking from and the spectacles one is wearing. It is a picture in which the viewer is also a participant, as the Escher Up and Down lithograph depicts. The boy pictured looking from bottom up sees himself in the picture seen looking from top down. A similar illusion features in Print Gallery (1956), where the picture is of a boy looking at a picture in a gallery, which morphs, as one’s eye moves (through top left, top right, bottom right, bottom left), into a picture of himself, within the gallery, viewing the picture.14

The experience and resources on which all services draw are best shared efficiently, cooperatively and collaboratively, and supported with coherent data, not beset by unnecessary duplication of effort. This requires common ground of information systems, not just a postal service between different ones. Although that too can be useful, it is not enough. Policy failure at this level reinforces damaging professional and organizational boundaries, and the failure of resilient balance and continuity of services for those in need. The information utility required can only grow from this common ground. Common ground is open ground, and it is governance of this open ground on which we need to focus, as we progress from Information Age to Information Society.

From Local to Global Village

There were advantageous characteristics of the village community of my childhood, which it would be good to see revived and renewed in the global village community of today. A principal aim of Part Three of the book is to propose and show how an ecosystem of health care information can be imagined and created, to meet the needs of the global villager and village, as a utility that realizes and adapts to new benefits achievable in the Information Age and avoids falling into new bear traps. This is a tale of two villages.

I have lived in both these villages. The first was the tiny village of my childhood, which I have already described–let us call it Localton, with its nearby local hill of challenges faced in everyday life there. A mountain of wider national challenges loomed from afar but were only sparingly connected with the local hill that dominated and most affected local lives.

The nineteenth-century pattern of remote and isolated village life of Localton is no more in the English countryside, but it is still lived in much of the world. In my great aunts’ village, movement between villages was conducted by horse and cart and Shanks’s pony (on foot), as described in Chapter Five when discussing the arrival of information technology. Night-time reading was by oil or gaslight. Roughly laid-out paths made for bumpy rides on early bicycles, arriving from Germany. Discovery and new means of travel and navigation between countries had been arriving slowly within local awareness, over hundreds of years, through conflict and commerce. Mass transport by ship, car and aeroplane, and their enabling and supporting infrastructures and regulation, started slowly and then arrived in waves over a century or so.

The second village is the city village in which I now live, called Fleetville. It is a distinct and now quite prosperous central area of the city of St Albans, having once been a poorer area, housing families working in the factories nearby. I will call it Globalton. It is in some ways quite like the Localton of my childhood–almost all daily needs within walking distance–but very different in its transport connections and links within multiple global virtual communities. It has its local hill of challenge in everyday life–keeping the community centre alive, regulating local car parking–but is immediately connected, through the Internet and other media, with the global mountains of challenge further afield.

Characterizing such a village community today, as described in the Sunday Times newspaper yesterday, is central Walthamstow in East London–of interest and memory to me as it is where my father grew up. This kind of village is now described as a ‘twenty-minute neighbourhood’, and comprises:

  • Home, children’s play areas, amenity green space, bus stop–within walking distance of five minutes;
  • Shops, bakery, butcher, cafes, nursery, pub/restaurant, hairdresser, primary school, village green, elderly day care centre, medical centre, community allotments/orchard–within walking distance of ten minutes;
  • Employment opportunities, workshops, shared office spaces, secondary school, gym and swimming pool, sports pitches, large green spaces in woodland–within walking distance of fifteen minutes;
  • Multifunctional community centre, business academy, college, bank, post office, place of worship, garden centre–within walking distance of twenty minutes.

It is a work in progress, opposed with scepticism and objection five years ago, and still no doubt with teething problems and troubles anew. It has been pedestrianized, and cars are owned by only forty-nine percent of households, compared to seventy-seven percent for the whole of the UK. Although quite small, it has proved viable and resilient in fostering new growth, its community strengths providing a foundation for reconstruction and rebirth.

David Goodhart discussed the challenge of finding a new balance of status and reward between the ‘anywhere’ and ‘somewhere’ of life today. His anywhere is big and global in scope and application, and his somewhere is small and local in everyday life. Globalton is both local and virtual community, and Globalton villagers live double lives. In the everyday, they share and navigate their ‘somewhere’ activities and the challenges of the local hill. In their virtual lives, work and travels, they connect and engage with village communities elsewhere, and with the ‘anywhere’ challenges on global mountains, experienced and framed more widely. In the Covid months, local group exercise and dance classes switched, with impressive dexterity and success, to individual Zoom participation from the home. Much group activity was transferred online but is eager to return to local ground.

In Small is Beautiful, Ernst Schumacher (1911–77) used the phrase ‘think globally and act locally’ to bridge responses to the big challenges of the global mountain with local community and business, contributing towards their solution by acting on a small scale on the local hill.15 The phrase is attributed originally to the Scots biologist, town planner and social activist Patrick Geddes (1854–1932). Some very successful charitable endeavours have achieved synthesis of this kind, bridging local lives and wider concern to contribute practically to the solution of problems on the global mountain. The Oxfam and Amnesty International movements have been creative and successful with this approach, although not without their own problems on both local hill and global mountain. The Internet has been a great enabler of local sharing and support in our global village–WhatsApp groups have bubbled into life along many streets.

The reverse mindset, of thinking locally and parochially, generalizing from problems on the local hill to justify and pursue self-interested action wider afield, has also been empowered in Globalton. The information revolution has harmfully enabled and conflated big and global thought and action anywhere, with small and local thought and action somewhere. Burglars in Localton tended to think locally and act locally, breaking in to burgle on a small scale, somewhere near home. Scammers in Globalton think globally and act globally, to deceive and rob anywhere in the world. One of our credit cards was scammed recently and, in a matter of days, thirteen thousand pounds of fraud had been charged to it in a total of some twenty places across the country! Social activists also think and act globally on major issues of the day and combine forces through social media to focus and coordinate local action.

Matthew Arnold’s (1822–88) book, Culture and Anarchy, was a nineteenth-century commentary on conflicts of culture in human society.16 I remember reading at school about his take on ‘Barbarians’ and ‘Philistines’! A hundred years on, in the late 1950s, Charles P. Snow (1905–80) divided cultures between the sciences and the arts. The Information Age has spread and amplified cultural division.17

Some prophets of change envisage and target a future culture of society, characterized by the ‘sweetness and light’ that Jonathan Swift (1667–1745) wrote about at the turn of the eighteenth century–a mature sense of beauty combined with alert and active intelligence! The culture of Globalton is an organic one, growing much faster than the Localton culture of my childhood, within times of Whitehead anarchy. Localton culture was grounded and sceptical. Globalton culture seems more beguiled by and susceptible to the promise of magic bullets; these sometimes get blocked and backfire in the barrel.

In their inukbook on the future of the professions in the Information Society, Richard and Daniel Susskind set out what they admitted was a very wordy and legalistic grand bargain governing future professional relationships with citizens.18 They saw such change as inevitable, with adaptation to the new reality being primarily a challenge of culture, values and expectations. I draw on their ideas in Chapter Eight, in discussion of the shape of health care professions in the future Information Society. There is a similar challenge facing each global villager, in combining their global thinking and action ‘anywhere’ with local thinking and action ‘somewhere’.

These are much the same challenges that Karl Popper (1902–94) addressed in his book The Open Society and Its Enemies.19 There is enduring conflict. Chapters Eight and Nine develop a vision of purpose, goal, method, team and environment for the creation of a care information utility, in the context of the transition from local to global village. Chapter Eight and a Half describes my experience of the past thirty years in working towards that end.

Instability of the Global Village

What promises to integrate and make whole, can lead to instability and fragmentation. The information landscape can become one of isolated power and influence, orchestrated globally from safe, high-up places. And into the intervening gaps and holes, the less powerful and more exploitable can easily fall, be it by default, lack of care, or incautious and foolish intent. The Internet has connected local villagers and village life into a virtual global village community. Between global villages, today, there are heightened inconsistencies and inequalities of health care. And there is heightened awareness of these, and of the global dimensions of the challenges they pose.

Along my songline, the technologies of telephony and broadcasting, and the superseding digital carriers over land, under sea and relayed by satellites, have evolved into high-capacity, superfast broadband. The information highway has channelled rivers of information through every village, flooding out across every plain. Mobile telephony has transformed life on a global scale. It provides a capillary network of information flow of seemingly limitless capacity, circulating pervasively in the world. And low-level satellite networks promise yet more. The network is good at delivering information; some is vital and benevolent and some harmful and malevolent. It is poor at clearing up information litter and removing its addictive substance and noxious toxin. It transports secret and criminal content that few know to be there.

Governments were left far behind in adjusting to this revolution. In failing to protect and prepare effectively for new governance, they delegated or abdicated power to eager new information barons of industry, commerce and crime. By default, citizens became clients of global information corporations and monopolies. Governments are now struggling to retrofit vehicle production standards, rules and regulations of the road, and means of navigation, amid information traffic that is fast-moving in all directions. It is a scary and unruly place for citizens to navigate–personal survival favours running for cover!

This revolution has transformed and tested the balance, continuity and governance of services. Human relationships and attitudes are also in rapid transition, challenging belief, culture and values. Facebook is the most wonderful of enablers of social life and the most awesome of Faustian bargains in its misuse and abuse. It is part-motorway and part-car. Unlike these, it has risen virtually unrecognized, hidden in plain sight. Global village citizens and burghers were bewitched by, and welcomed in, what was camouflaged and portrayed as a gift. This gift transformed into a magical power to connect any person with anything. It opened, far and wide, a Pandora’s box of unknowable sequelae. It enabled local eyes to see into the global village and this brought new personal power. Local eyes and ears could be conditioned and manipulated, with image and conspiracy planted from far away, in furtherance of wider and unknown powers.

In the information pandemic that followed, education, research and scholarship, trade and profession all started to operate differently. Cooperation, collaboration and supply lines of trade connected, extended and flattened around the globe. And the politics of the global village polis entered an uncertain evolving order and chaos. Thomas Friedman wrote The World is Flat and Francis Fukuyama, The End of History, imagining these new bridges to the future and their implications.20

There is not yet a discernible centre or law of the land in this global village. There is not yet politics fit for such a global polis. There is shared intention and cooperation in building the information network, but little power in finding common ground on which to regulate it. Rather, it has become an instrument and battlefield of conflict and interest. The landscape is then left to the exercise of unbridled and arbitrary power. Attempts to shore up existing frameworks of law and regulation have too often been met with artful and inequitable circumvention of accountability.

The Industrial Revolution was a seedbed of wealth creation and human emancipation, however imperfectly and however unfairly, as it was of empire. From the shocks and after-shocks of conflict and disease, and progressive social emancipation, consensus was created and led towards global institutions of finance, enterprise, health and governance, founded on belief in human rights and strong democratic institutions. They trusted in, and hoped to foster, the better angels of our nature, as Steven Pinker described them.

The Information Age is a comparable leap forward and disruption of the status quo. It is probing the limits of social cohesion and resilience of the global village life it has led to. This is playing out in the wider context of global disruption and inequalities of climate, economy and politics. These perturbations are of such a scale as to defy local resolution and of such a nature as to challenge global action. Global viral pandemic, as much as global climate, transcends boundaries and floats under and over drawbridges. Information for health care is pervasive, both a global and a local utility. It requires an adaptable fusion of global and local architecture, with global and local governance. This is the space that the creation of an information utility will populate.

Lifespan, Lifestyle and Health Care

In the third chapter of On the Origin of Species, Charles Darwin (1809–82) wrote about the struggle between species:

It is good thus to try in imagination to give to any one species an advantage over another. Probably in no single instance should we know what to do. This ought to convince us of our ignorance on the mutual relations of all organic beings; a conviction as necessary as it is difficult to acquire […] When we reflect on this struggle, we may console ourselves with the full belief, that […] the vigorous, the healthy, and the happy survive and multiply.21

This was his observation and reasoning about the natural world. Survival and procreation reflect the biology of life and the behaviour and circumstances of living. In human society, we might simplify these under the headings of lifespan and lifestyle.

One of my more recently added inukbooks is Lifespan, by David Sinclair.22 It gives a biological context to healthy living and ageing, as seen by a life scientist and clinician. It presents a visible prospect of normal human life that could extend over one hundred and twenty years. It describes the author’s personal daily prophylaxis for keeping at bay the chronic conditions commonly associated with ill health and ageing. A fortunate life today divides lifespan approximately into a twenty-five-year period of bodily and educational growth, development and exploration, a forty-year period of work and personal and family development, and an indefinite period of retirement, usually up to twenty-five years, with freer scope for enjoyment and fulfilment, before a subsequent, hopefully rapid, decline to end of life.

Lifespan and lifestyle are central preoccupations of the lives we live, in the generally healthier age in which we are lucky to be living. Short and long, healthy and unhealthy, happy and unhappy, they are two sides of a spinning coin. These sides connect, of course, and nowhere more immediately and with greater consequence than through information that flows between them.

One side of the coin, that of lifespan, faces towards health care systems and services, enabling and helping citizens to keep and be kept well, and providing intervention and support where and when needed, from cradle to grave. Information that connects citizens with services is a crucial determinant of their timeliness, effectiveness and efficiency. The other side of the coin, that of lifestyle, faces towards every citizen, individually–towards personal circumstances, activity and behaviour that make for a fulfilled and healthy life. Lifestyle and information about lifestyle cover a wide and open context–nutrition, exercise, housing and personal security, sense of purpose, work and leisure, feelings of enjoyment, caring and being cared for, and trust. Lifestyle reflects environment, personal preference, choice and opportunity.

On both sides of the coin, there are considerations of knowledge, finance, environment, equity and governance. Information and information infrastructure connect within and between the two sides; knowledge as information with causal power to facilitate and deliver care services and self-care, in support of both lifespan and lifestyle. It is an ever-changing and varied scene. Well-customized and accessible information is a prerequisite of improvement in the balance of individual lifespan and lifestyle. Misguided or misused information is a reflector and amplifier of their imbalances and imperfections.

To keep fit and well, some like to puff and pound the streets, cycle or swim, pulsing the endorphins through their body through vigorous exercise. Others walk or stretch and balance the body in Pilates, or practise and create dance. Some read, cook, garden, paint, play, relax and talk. How such a pattern can square throughout a potential future one-hundred-and-twenty-year lifespan, and balance with economy, environment, and personal, community, and wider public health care services, is a puzzle. It is a puzzle faced on differently connected levels–personal, professional and public. As knowledge, capability and capacity change, so does a 3x3 matrix of balances–personal, professional, and public, in rows; knowledge, capability, and capacity, in columns. It is also a matrix of information and information flow.
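
For concreteness, the matrix can be sketched as a simple data structure, as below; the cell entries are illustrative placeholders only, not a canonical filling of the grid.

# A sketch of the 3x3 matrix of balances: who is involved (rows) against
# what is changing (columns). The cell entries are illustrative placeholders.
ROWS = ("personal", "professional", "public")
COLS = ("knowledge", "capability", "capacity")

balances = {
    ("personal", "knowledge"): "understanding one's own condition",
    ("personal", "capability"): "self-monitoring and self-care",
    ("personal", "capacity"): "time, means and support at home",
    ("professional", "knowledge"): "evidence, guidelines and experience",
    ("professional", "capability"): "skills, teams and tools",
    ("professional", "capacity"): "clinics, beds and staff hours",
    ("public", "knowledge"): "population data and research",
    ("public", "capability"): "services, infrastructure and regulation",
    ("public", "capacity"): "funding and workforce",
}

# Read the grid row by row; the same grid can equally be read as a map of
# the information that must flow within and between its cells.
for row in ROWS:
    print(row, "->", {col: balances[(row, col)] for col in COLS})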

The Information Age has led to more wide-ranging, and more effective, health care interventions and means to enact them. Capability to deliver the methods entailed and capacity to access the resources required to enact them have become increasingly commoditized, extending their availability and uptake within the personal domain. I can measure my blood oxygen saturation in a few seconds, with a pulse oximeter that I can buy online. In the professional domain, capability to measure, interpret and intervene has likewise advanced, bringing a new balance of professional skills, roles and team capabilities, and organization of the capacity required to enact them.

Responding to a stroke with the best methods of the day is a logistical puzzle of organizing fast-enough access to treatment within a network of highly specialized centres. For everyday ailments where the body tends to heal itself over time, the puzzle is a balance of online, over-the-counter, and professionally delivered products and advice. Responding to a high temperature and sore throat may justify use of a thermometer and a soothing or more purposeful medicament–but deciding how strong and purposeful a remedy should be pursued is a balance of personal, professional and public choices. A sore throat can be coped with, but a recurrent sore throat is another matter. An antibiotic can fix a problem quite quickly, but over-prescribing risks adaptive resistance of infectious organisms and progressive public harm. In the growing domain of management of chronic illness and disability, the professional domain is increasingly overwhelmed by demand. Means and responsibility for monitoring and managing these conditions are increasingly shared and enacted in the personal domain and within the global village community. Not just my granny weighing her bread and injecting daily insulin to control her diabetes, but easier-to-take medicines and the adoption of calibrated and supported lifestyle change. Monitoring and reporting of vital signs have become commoditized–blood glucose levels, oxygenation and pressure, heart rhythm and more–with recording, analysis and reporting of measurements, medicaments and outcomes on smartphones.

Recognizing and responding appropriately to the changing scene is a balance of personal, professional and now computational capability and capacity. Information is central to balance and continuity of health care services. It is central to governance within and among personal, professional and public domains. It is central to equity and trust, on which all these depend.

Information Society health care services of the future will continue to rest on evolving knowledge, capacity, circumstance and choice. Their creation requires renewed purpose, policy and political will, mirroring that of the 1940s. It is a puzzle–better thought of that way than as a problem–that can only be tackled on shared and open ground. It should be tackled in such an environment and in such spirit. To succeed in making the picture whole, it cannot be seen as a contest of capitalism and socialism–it must be a fair and productive partnership of their respective motivations and merits.

Not all needs and expectations can be met; there is no rising tide that can float all boats. There will be necessary, but not necessarily welcome, adaptation on all sides, in roles and responsibilities, trust and expectation, in achieving and sustaining balance matched to the changing society of the Information Age. Lifestyle and lifespan press on the sustainability of many balances: work and leisure; production and consumption; climate and natural environment; personal and public equity and responsibility; private, public and social enterprise. Gains in lifespan and quality of life of the past century have depended on increasing wealth, impact of advances in science and engineering, shelter and public utilities. We must create and sustain a useful and accessible care information utility that matches and adapts to this changing reality. How we do this will be a crucial determinant of sustainable progress in reinvention and reform of health care, more generally.

There will be new discoveries and developed technologies that continuously change the context, quality and organization of health care services. Automation and robotics are rising tides. They are building at the interface of science and information technology and promise new space and opportunity to meet some human needs more effectively and efficiently, while avoiding some aspects of the associated work that humans do not enjoy. They proffer a double-edged sword in relation to present-day lifespans and lifestyles. Working life as drudgery is dire, preferable only to having no work and no means of sustenance. Purposeful working life is a gift–as gainful or voluntary employment, as enjoyment and fulfilment, or as enabler of fulfilling leisure time. For some, these trends towards automation will be felt and experienced as receding tides, as they drain away and replace work and opportunity that is important to them.

Genomics science and technology promise early identification and awareness of susceptibility to disease. Conditions that might emerge later in life can thereby be anticipated, guarded against and mitigated from early in life. Hitherto opaque and intractable disease may be better understood, characterized, treated and potentially eliminated. Synthetic biology is a rising technology that promises superior and more swiftly and systematically developed pharmaceuticals and cleaner environments. Other rising technologies are targeted towards improved sustainability of the earth’s resources and ecosystems, and projection of human life beyond the earth. As ever, earth-wide challenges, such as viral epidemic, and plans to combat and overcome them, rest on common purpose and shared values. Maybe these will not arrive in a humanly good-natured way, waiting on change enforced by conflict or natural disaster, as Houghton feared.

Good, coherent, openly-shared information systems already underpin many domains of science and engineering. Health care, lifestyle and lifespan-related information has been a significant laggard, notwithstanding massive expenditure, and policy peroration. It is unfortunate that information utility for these has become unduly entrained with commercial ambition and monopoly, leading away from integrative utility into fragmented silos of method, infrastructure and data. Fragmentation of health care services has become mirrored in fragmentation of health information systems. Health care system and care information utility must come to a new balance if the current malaise is to be resolved. The rising tide of information pandemic is a pivotal testing ground of opportunity to tackle the many imbalances it reflects, challenging future lifespan and lifestyle, globally. Can the Information Society both retain and build on the benefits of information technology, and combat the disbenefits that it also engenders, to win through to sustainable new balance?

In Chapter Five, I wrote of information engineering as practical discipline at the interface of science and society. Chapter Six explored the evolution of information as a scientific concept, illuminating understanding of living systems. In this chapter, the perspective has broadened to encompass information as knowledge with causal power to guide and support interventions and behaviours in matters of lifespan and lifestyle. I now embark on a rapid flight along my songline, charting the detailed historical coevolution of health care services with this changing information landscape. It passes over an amazing period of scientific and engineering advance and illustrates how policy and practice of health care services have reflected this. In this respect, it has been an era of excitement in achievement.

There is a parallel landscape reflecting how expectation and experience of health care services have changed, for all involved, and how their organization and governance have adapted. In this respect, it has been a more painfully anarchic era, reflecting and reflected in wider political and economic struggle and social change. In debates about health care information systems, opinions and criticisms are expressed from and aimed in all directions. There is speculative and often unproductive investment in the new. This populates and clutters the stage with many actors talking across one another. Some actors remain there too long, being still powerful and unwilling or unable either to adapt or to leave the stage. Some new ones, with messages worth hearing, find no space and are not heard. The issues faced are divisive. Their resolution digs deep into assumption, interest and belief, and, of course, into pockets as well. Attention to the 3x3 matrix of coherent future information utility–to achieve balance, continuity and governance, across personal, professional and population domains, aligned with purpose, policy and plan of feasible implementation–has been fragmentary. Left substantially and by default to market forces, it has been left adrift and behind.

Coevolution of Health Care with Information Technology

The information revolution has brought new methods to measurement, analysis, reasoning and action in clinical practice. It has brought new understanding of the nature and scope of health care interventions, and their outcomes. Measurement devices and information systems feature ever more widely, from small, sometimes now wearable, devices, to systems that capture, represent and integrate data at all levels, from the local and personal to the public and global. Methods of control of devices, data analysis and communication operate over similar range and depth.

Health care services have co-evolved with information technologies for some sixty years. The software in use has been like a tapestry woven over time, with multiple threads drawn from across the public and private sectors. In some areas, the pattern woven has been chaotic and confusing, and in others, more coherent and useful. Radiotherapy and radioisotope methods yielded early gains in medical physics, which also pioneered medical imaging–another success story. Clinical laboratory method was also an early focus. Medical records and clinical decision making have long been a curate’s egg of successful and unsuccessful exemplars. Organization and management of service delivery became major concerns, and supporting IT systems came and went, with varying success. Primary care computing, with its stronger emphasis on local clinical autonomy, has perhaps been the greatest success story in the NHS, over time.23

It is in local community contexts and on common ground that a more coherent future information utility can now grow. The secondary and tertiary care domains are more deeply entrenched in silos of data. They will join in with the endeavour as the benefits of doing so grow and the disbenefits of not doing so evaporate over time. Policy and investment should reflect this shift of attention from Industrial Age to Information Age medicine.

In well-defined, practical and everyday contexts, machines are beginning to prove adept at learning rules for interpreting data and codifying knowledge, in useful ways. As with their learned strategies for playing the games of Chess and Go, they are increasingly adept in categorizing and interpreting complex patterns and images. That said, clinical histories are revealed, understood and told differently in different contexts, where they may have different meanings. They have social, economic, commercial and political contexts. The patterns they exhibit may be recognized by an experienced human, in ways they cannot fully articulate, as if their human mental process were an opaque machine-learning algorithm. They may become better understood through deliberative and experimental iterative processes of hypothesis, experiment and review, whereby science manages its endeavours; but these are wicked problems and there may never be objectively neutral observers of such experiments.

In all this, a gulf is growing between human and machine expertise in illuminating and tackling tasks central to health care. Giving the machine free rein may prove a Faustian bargain–we do not yet know, and opinions differ. I have drawn on three contrasting sources that have illuminated the complex scene for me, illustrating what is at stake. After their authors, I have named them the Birnbaum beatitude, the Weizenbaum warning and the Illich apocalypse (of iatrogenic disease–I like playing with words!). These are, respectively, descriptive, apprehensive and contrarian in nature.

Birnbaum looked forward, optimistically, to realization of benefit from what he called information appliances and information utilities. Weizenbaum looked on, concerned by the encroaching debasement of health care professionalism. Illich looked back, inveighing against what he saw as the harm done to society by Industrial Age medicine, and arguing for reversal of the progressive medicalization of life.

The Birnbaum Beatitudes

A useful overview of the evolution of information as a utility was given in a lecture I listened to at the Royal Society in London, in 1999. This was delivered by Joel Birnbaum, an engineer who led research and development activities for the then world-dominant Hewlett-Packard Corporation. I call them beatitudes because they presented as a rather harmonious and logical flow. That is not the way they were experienced, of course.

A graph entitled ‘Co-evolving hardware and computation - changing workflow over six decades’. The horizontal axis shows the decades from the 1960s to 2010; the vertical axis is labelled ‘Pervasiveness of information systems’. A line runs diagonally across the graph, marking successive forms of hardware and computation.

Fig. 7.3 The coevolution of information technology hardware and computation during the Information Age. Image created by David Ingram (2010), CC BY-NC.

Birnbaum described the rise of information technology over five eras, relating patterns of usage of computers with successive generations of infrastructure, during which information systems became a pervasive reality in everyday life (Figure 7.3). I used this chart in my talks of the time, as a template to depict the corresponding evolution of information systems in health care services.

In the 1960s, large mainframe computers predominated and giant companies in Europe and the USA–International Business Machines (IBM), Control Data Corporation (CDC), Universal Automatic Computer (UNIVAC), Honeywell, International Computers and Tabulators (ICT), Bull, Siemens–did battle. Such machines grew in power over two decades. In the UK, Elliott Automation and Ferranti, with their close connections also to military systems, built smaller-scale machines and had early success in the world of industrial electronics. They were early pioneers of semiconductor technology, leading to integrated circuit and computer processor components, but were swept past in the marketplace by American giants that took these innovations to a much greater scale. Huge manufacturing plants were placed in poorer countries in the Far East. I visited one in Malaysia when asked to go on an assignment there for the Commonwealth Secretariat, to advise their government.

The large mainframes started by executing program tasks, one at a time, in batches. They progressed to sharing their capacity among multiple simultaneous tasks and users. The software needed to operate these machines was largely devoted to managing attached devices, such as card and tape readers, printers and disc storage; to scheduling program execution; to running the variety of language compilers that converted source programs into binary executable form; and to loading that code into memory for execution by the mainframe processor. Remote job entry was possible via subsidiary connected machines that were dedicated to storing and forwarding submitted jobs to the mainframe computer and receiving and printing the output files transmitted back to them.

The next stage was the introduction of a time-sharing operating system, slicing the available shared resource among the needs of a variety of simultaneous jobs being edited, compiled and run from instructions typed in by several different users, sitting connected online at a teleprinter or visual display unit. This involved allocating a section of the available memory to each user and switching the processor resource among them according to an algorithm designed to smooth out demand over time, as well as keeping control of the function of the other attached and shared devices. Over two decades, I worked with IBM 360 series, CDC 7600 series, ICT 1900 series and ICL 2900 series computers. It was Conway Berners-Lee (1921–2019), father of Tim Berners-Lee (the father of the World Wide Web!), who came to our hospital, with Ted Coles, a future head of medical informatics in Cardiff University, as a salesman for the ICT 1900 series.
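
The time-slicing idea can be made concrete with a minimal sketch of a round-robin scheduler of the kind such systems embodied; the job names, work units and quantum below are invented for illustration and do not describe any particular machine of the era.

from collections import deque

def round_robin(jobs, quantum):
    """Share one processor among several jobs, a fixed time-slice at a time.

    jobs maps a job name to the units of work it still needs;
    quantum is the number of units granted per turn.
    Returns the order in which slices were executed.
    """
    queue = deque(jobs.items())
    trace = []
    while queue:
        name, remaining = queue.popleft()
        slice_used = min(quantum, remaining)
        trace.append((name, slice_used))
        remaining -= slice_used
        if remaining > 0:
            queue.append((name, remaining))  # rejoin the back of the queue for another turn
    return trace

# Three users editing, compiling and running, interleaved on one processor
print(round_robin({"edit": 3, "compile": 5, "run": 2}, quantum=2))

Real time-sharing systems wrapped memory allocation, device handling and accounting around this simple core, but the interleaving principle was the same.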

In the early 1970s, the exponential rise in processor power, combined with the progressive miniaturization of semiconductor devices, meant that medium-sized computers began to match previous-generation mainframe performance, but in a more flexible and customizable configuration. They were still built from large, heat-producing panels of electronics. The design was modular and allowed for extension by incorporation of custom-built electronics, to control new prototype devices.

These minicomputer machines had operating software that could likewise be configured more flexibly to acquire and digitize signals at variable rates, from external devices such as body scanners, and drive outputs to generate images on higher resolution display devices. They found application in the control of laboratory equipment and industrial plant. The operating software required was becoming more complex; the new machine ‘users’ exhibited different and often time-critical characteristics, which the controller had to adapt to in its sharing algorithm. This had to be able to stop work on one user program and switch almost instantly to deal with a time-critical event for another user program elsewhere, and then revert to the previous program, seamlessly. This brought evolution in both machine design and software architecture, the software necessarily coded in machine language, very closely coupled with the design of the computer processor and its peripheral devices. ‘Multitasking’ operating systems were designed to meet these new requirements and posed new challenges: machine hardware and operating software evolved synergistically, and sometimes software could not meet the challenge and new hardware was required.
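
The preemptive switching described here can likewise be sketched in outline: a due time-critical event interrupts the background program, is serviced, and the interrupted program then resumes. The event names and timings below are invented for illustration.

import heapq

def run_with_preemption(background_work, events):
    """Simulate a simple multitasking controller.

    A background program runs one unit of work per tick, but any due
    time-critical event preempts it first. background_work is the number of
    work units the background job needs; events is a list of
    (due_tick, name) pairs. Returns a trace of what ran at each tick.
    """
    pending = list(events)
    heapq.heapify(pending)  # earliest-due event first
    trace, tick = [], 0
    while background_work > 0 or pending:
        if pending and pending[0][0] <= tick:
            _, name = heapq.heappop(pending)
            trace.append((tick, f"handle {name}"))      # preempt: service the event
        elif background_work > 0:
            background_work -= 1
            trace.append((tick, "background program"))  # resume the interrupted job
        else:
            trace.append((tick, "idle"))                # nothing due yet
        tick += 1
    return trace

print(run_with_preemption(background_work=4,
                          events=[(1, "scanner sample"), (3, "display refresh")]))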

This was a domain that required both physics and engineering expertise, to frame and connect application requirements and system design, to match with viable hardware and software. Applications involving instruments located at a distance from one another, such as in controlling electricity and gas utility supply networks, brought the need for a distributed computing network organizing communication among its nodes. Operating systems able to function across such a network, to control the signalling between and scheduling of activities, stably and sustainably, exercised systems programming further. Messaging systems, and telephone and telecommunication systems more generally, evolved into the digital era.

Mainframes and minicomputers were first connected with remote users via analogue signals transmitted over telephone lines. The arrival of digital protocols and standards for telecommunication facilitated direct connection from computer to computer. The bandwidth of these connections grew rapidly, accelerated with fibre optic and microwave links. These computers were then joined in technically standardized networks and the software of operating systems was extended within a new architecture, enabling jobs to be run on and shared between multiple machines. This process accelerated with the arrival of the World Wide Web, as an architecture for connecting and distributing information resources within an unbounded network. Satellites in geostationary orbit became carriers of broadcast media and the one-way transfer of data. But the latency of signals transmitted over such long distances made them clunky as nodes in interactive networks. Terrestrial networks enabled single tasks to be shared among communities of users, as well as a single user’s task to be shared across multiple computers. Collaboration in the performance of a shared task, and multiway communication within teams, became possible.
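
The clunkiness of geostationary relays is simple to quantify from first principles: the orbit lies roughly 35,786 kilometres above the equator, so even at the speed of light a ground-satellite-ground hop takes about a quarter of a second, and a request plus its reply close to half a second, before any processing delay. A back-of-envelope calculation:

# Back-of-envelope latency for a geostationary satellite relay,
# assuming the satellite is directly overhead (real slant paths are longer).
ALTITUDE_KM = 35_786            # approximate height of geostationary orbit
SPEED_OF_LIGHT_KM_S = 299_792   # speed of light in vacuum

one_way_hop_s = 2 * ALTITUDE_KM / SPEED_OF_LIGHT_KM_S   # ground -> satellite -> ground
round_trip_s = 2 * one_way_hop_s                        # request out plus reply back

print(f"one-way hop: {one_way_hop_s * 1000:.0f} ms")    # about 239 ms
print(f"round trip:  {round_trip_s * 1000:.0f} ms")     # about 477 ms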

This was an era in which I studied all rival products in the marketplace, each company jostling for orders. I persuaded the Bart’s Medical College to allow me to purchase the earliest of the DEC PDP-11/45 computers, to install in the new clinical skills teaching laboratory for which I had led the joint medical and nursing college project team.24 I designed and procured the system and installed, configured and maintained the software, extending out to a college-wide set of clinical department users (the preclinical departments were on a separate campus at Charterhouse Square, half a mile away) through a cabling network. Great attention had to be paid to earthing between buildings, mitigating susceptibility to lightning strikes, and David Lloyd (1940–2023) in the Medical Electronics Department helped hugely in all of this. It was very time-consuming work for a year or so until the College took the load from me, allowing me to appoint a dedicated team. It was a draining, full-on but highly educational time, all the same! It equipped me with shop-floor experience that was very useful in subsequent roles, in which I chaired the UCL Infrastructure Committee overseeing information systems, helped in the procurement and implementation of its new finance system and led the amalgamation of the IT support teams across all the specialist institutes and departments of the UCL Biomedicine Division. It was useful, too, in my national roles on oversight boards for eScience and IT infrastructure at the Council for the Central Laboratory of the Research Councils (CCLRC), at Harwell, near Oxford.

After an event at the Royal Society of Medicine, where I spoke, I found myself invited by the Chief Executive of the time, John Green,25 to sit for the ensuing grand dinner at a table he hosted with the then Foreign Secretary, Peter Carrington (1919–2018). I was sat next to a hospitable, quite elderly, but very lively American woman, and opposite to her husband. At the time, I was quite involved with the work creating an electronic museum of tropical medicine for the Wellcome Trust and had recently become a Professor at Bart’s. I was quizzed about all this, and the subject turned to satellite networks and what they might offer. I asked her what her interest was, and she told me, in a matter-of-fact way, that she and her husband–sitting opposite, who beamed across at us–owned two satellites and were interested in whether they could contribute by connecting educational resources accessible throughout developing countries in Africa! She told me her name as we parted–Fleur Cowles (1908–2009). I mentioned this to Lesley Rees (1942–2022), my dean at the time, who knew everyone in London, it sometimes seemed. She expressed amazement, explaining that Fleur Cowles was a famous writer, best friends with the Queen Mother, and that she and her husband were icons of US/London social circles! I assume billionaires, too, by the sound of it. Nothing came of it, but it was fun to brush shoulders with ‘billionairedom’!

The first microprocessors–computers on a chip–had come from Intel in the early 1970s, and the Intel 8080 of 1974 gave birth to early microcomputers such as the Altair. Similar chips that failed to gain sway came in England from Ferranti and Inmos. This technological advance heralded the advent of microcomputers–in time, these cost less than the earlier typewriter-like user input/output terminals and were able to process as powerfully and with as much memory as their mainframe and minicomputer predecessors.

Early in the 1980s came the Acorn BBC microcomputer, costing several hundred pounds, hugely configurable by its users for programming multiple applications, and educational in scope. It was designed and manufactured as part of the BBC’s major Computer Literacy Project. In the later Domesday Project, commemorating the nine-hundredth anniversary of the Domesday Book, the BBC undertook a snapshot survey of the culture and times of national life, captured in images, descriptions and surveys, involving schools, all mastered onto interactive videodisc and played from the BBC micro. It was this sort of project that led me to spend a good deal of (ultimately wasted!) effort in getting to grips with using the interactive videodisc in medical education.

The Acorn machine was adopted in homes and schools across the country and provided a major fillip to the market, for companies writing new applications. Software became a key driver of standardization and CP/M and MS-DOS were central to this. Many other chip makers and microcomputer developers had joined in–Texas Instruments, Intel, Zilog, Advanced Micro Devices (AMD)… Atari, Commodore, Apple. Other players took the stage; computer games became major software products. Word-processing and office administration tasks became new markets for microcomputer-based machines and software.

The minicomputer manufacturers tried to catch up and maintain their upper hand. DEC tried its hand with a machine they called the Rainbow, but there was no gold to be found at its foot, sadly! They ported the PDP-11 minicomputer RSX operating system to run on a microcomputer they grandly called the DEC Professional, and I used one of these for a while to produce the graphical versions of the Mac Series of human physiology simulations. It was an uphill and unrewarding journey, other than to prove it could be done. The company lost out by not following closely enough the newly emerging operating system standards. They probably thought that their pre-eminence would continue to hold, but it did not, and they quite soon disappeared. IBM, which had tracked the minicomputer makers through that era, cleverly sustaining markets in which mainframes were no longer technologically competitive, was again astute in falling in line with the de facto MS-DOS operating system standard. The IBM PC became the central focus of the expanding microcomputer marketplace. Office applications had expanded the markets of the minicomputer era and many dedicated microcomputer-based word processors achieved a significant market share. The standardizing clout of Microsoft prevailed, and office software became a main plank of its rapidly developing worldwide business.

The maturation of technology was, Birnbaum explained, continuing to a point where the information network would become as invisible in everyday use as the networks of pipes delivering water to every home; turned on and off, heated, pumped, filtered, consumed and discarded. It would, he said, be like any common utility, most noticed when malfunctioning and otherwise not registered. This was in marked contrast to the early eras of computers, or cars for that matter, when the user or driver had, and needed, a high level of awareness of the inner workings of the machine and of the requirement to tune and maintain it in everyday use. He believed that this information utility would transform the landscape of commerce as well, with far fewer companies focused on hardware and many more focused on applications delivering value for customers and consumers.

This reality advanced rapidly from over the horizon, as modular microcomputers were built into the ever-larger banks of processors, comprising computational and storage nodes of the Internet. These became the Grid of high-performance computing–the petaflop/petabyte computational platforms of the era of eScience that ensued in the 2000s, which I observed first-hand in many of the sciences, sitting on the national e-Science Board and the Scientific Advisory Board of the Council for the Central Laboratory of the Research Councils (CCLRC). They started to provide computational resources for large-scale science, supporting data capture and analysis for national laser, synchrotron and neutron source apparatus on the Harwell campus, processing power for data analysis of the Geneva Large Hadron Collider physics community, and coordination of the network of telescopes in use in coupled astronomical observatories located around the world.

The Grid described by Birnbaum has evolved into the Cloud infrastructure of today, hosted by Microsoft, Amazon, Google, Apple and Facebook (see Figure 7.4). The technology standardization paradigms across these communities have come from the World Wide Web Consortium (W3C). They extend the penetration of the Birnbaum-inspired slide of Figure 7.3, into the new and coming era of connected devices and the ‘Internet of Things’.

A presentation slide reading: GRID computing - towards the information utility / ‘the information utility will soon make possible a network of distributed electronic services built on open standards that will irrevocably alter most information-dependant industries… / We can expect the huge number of companies today offering essentially the same services to be reduced to just a few, while […] an even greater number […] built upon commodity-like platforms and interconnect middleware will spring into being.’ (Joel Birnbaum, Royal Society Lecture, April 1999).

Fig. 7.4 Anticipating Grid Computing, the Cloud, and information as a utility–from After the Internet, Royal Society lecture, Joel Birnbaum, 1999. Image created by David Ingram (2003), CC BY-NC.

A graph entitled ‘Co-evolving health care and informatics - changing focus over six decades’. The horizontal axis shows the decades from the 1960s to 2010; the vertical axis is labelled ‘Pervasiveness of information systems’. A line runs diagonally across the graph, marking various forms of health care services and their supporting information systems.

Fig. 7.5 The co-evolving focus of health care services and their supporting information systems during the Information Age. Image created by David Ingram (2010), CC BY-NC.

After listening to the Birnbaum Royal Society lecture, I reflected on the timeline of evolution of computer applications and information systems within medicine and health care, and on how health policy focus had correlated with the evolution of technology and systems in the five decades that he described (see Figure 7.5). I have extended this to a sixth decade, which has been aptly described by one of my eight great colleagues from different disciplines (who read and commented at length on the first full manuscript of this book) as being characterized by ‘[transition in the] functional hierarchy between the patient as an object of care, to the patient being an actor in care, to the patient managing their own condition with the help of clinicians […] the role of IT is critical and essential’.

Sometimes, ideas experimented with, and knowledge acquired, in each of these eras, persisted fruitfully into subsequent eras, but only when new software methods and tools had emerged to bring them to fruition. Sometimes, knowledge and experience gained was lost, as has been characteristic of the chaotic and explosively innovative Information Age, where focus on the new has buried much learning from past endeavours. It is interesting to reflect on how the tools and infrastructure now available would have alleviated a considerable amount of the legacy and burden of technological obsolescence that now persist in major IT infrastructure, systems and services.

Era 1: 1960s–1970s–Instrumentation

In the mid-1960s, there was a combination of large mainframe manufacturers, targeting contracts to provide general purpose computational capacity, and smaller scale machine manufacturers, who worked in specialist markets such as industrial automation. The larger were typified by IBM and UNIVAC, in the USA, and ICT, in the UK; the smaller by DEC and Data General, in the USA, and Elliott Automation, Plessey and Ferranti, in the UK. There were early partnerships between companies active in supplying computers and others focused on a particular area of application, such as the automation of laboratory chemistry tests, performed on samples taken from patients, and analyzed in hospital chemical pathology laboratories. The Technicon company led this field in the USA.

The larger companies had their eye on hospital-wide patient administration systems, to tame the paperwork that tracked and recorded information about inpatients and outpatients, from their hospital appointment or admission to their final discharge from care. Much of the impetus for this came from the companies themselves and they paired with willing and innovative clients within health care. The scale and complexity of such activity gradually became clearer, as systems analysts and programmers, who were brought in to work on the projects, struggled to specify, write and test software and implement systems. The teams of hospital staff that they worked with gradually became more aware of the scale of commitment such planning, design and implementation required from them, and the disruption it brought.

The process of formalizing requirements and a brief for what computers were being purchased to do revealed a lack of clarity and consensus, as well as ambiguity and inconsistency in how the current services worked. Humans were used to patching and adapting, to circumvent these weaknesses. Computer programs were less forgiving, and their successive patches accumulated new vulnerabilities. Software entropy came into existence! Custom and practice–ways of working on and working around problems–were intertwined. The design choices made rested on clinical and management authority–on whose word counted and who was in charge.

From this boggy and buggy terrain, there arose a focus on operational research–formal mathematical methods for analyzing and guiding towards efficient organization of services. This invoked statistical models and methods of resource and cost allocation, workflow and queueing of throughput, and the like, used to evaluate alternative patterns of service. It became an area of interest for the already well-established medical physics and emerging bio-engineering communities, who saw the advent of the computer as a natural domain and professional opportunity for them to develop and improve the range of support services in which they were involved, and as an opportunity to expand their role into information engineering and hospital administration.

Pioneering initiatives emerged on many fronts. Traditional professional rivalries over territory and power extended into this new arena. Each group brought their problems to the table to justify their requirements and priorities. Patient experience was not yet a widely used term. Regarding computers and computerization, it was, inevitably, a country of the blind. I was based in medical physics at University College Hospital (UCH) at that time, and saw and participated in several projects in radiotherapy, medical imaging and intensive care. This was an era where each innovator had to start by purchasing the computer machinery and creating the wider technical infrastructure required to tackle development of the information system in their local health care context. There was no industry standard network protocol–Ethernet networks only slowly emerged and Open Systems Interconnection (OSI) and Transmission Control Protocol/Internet Protocol (TCP/IP) contested for what in the end became de facto rather than de jure dominance of transmission protocols. Connection was an electrical engineering task to pipe analogue signals along coaxial cables or twisted pair telephone lines. It was an engineering challenge requiring skill and persistence, and a lot of work, but did not have a lot to do with health care.

And the policy makers at local and national level became too involved in the machinery. Assuming that their size gave them power to mandate, they presumed to prescribe and implement change, expecting to bring order by dragooning the NHS to adopt service-wide messaging standards, such as Electronic Data Interchange for Administration, Commerce and Transport (EDIFACT). It was a chaotic world, and most innovators kept their heads down and sought local order, rather than lifting their eyes to see that order was needed much more widely, across domains over and beyond all health care institutions and services, before any of their health care initiatives had a chance to gain traction and scale. This approach is the tail wagging the dog. It persists to this day, with machine and management imperatives intruding on the health care imperative. It is an approach in which the medium becomes the message. But clinical communications are about meaning, not the message formats of machine protocols.

These ambivalences and struggles became harbingers of chaotic cultural and professional change in health care services. Leadership through such times was challenging. The challenge was to learn how to use IT effectively by making and doing things experimentally, and to build discipline and capability to extend incrementally. The nature of the experiment was misunderstood; it was an experiment in understanding the nature of health care and its ways of doing things, not one of using well-defined and suitably configurable computer methods and tools to perform well-understood and suitably adaptable tasks. Neither premise–well-defined tools or well-understood tasks–was true. The transition embarked on was not specifiable within the known scope, scale, capability and cost of what it would take to realize the hubristic ambitions assumed, from helicopters on high, to be achievable.

Many bucked the challenge and others rose to it and were heroic, showing immense humanity, depth, and resilience. It turned into three more decades of work and has still not achieved its goals beyond local and specialized scope and scale. Many who took on the baton of leadership must have regretted doing so–discovering, mid-storm, that there was no end in sight, losing energy, and giving up. Others soldiered on through a succession of groundhog days!

How the task was led and tackled, given its inherent nature as a wicked problem, was as much about character and style of leadership, as it was about what was sought to be made and done. As Fred Brooks would have known and said from the start, this kind of task needed good and trusted architects, able to combine leadership with ability to work with and integrate the needs and perspectives of different sections of the organization to be served. In Chapter Eight, I describe some great pioneers, who I knew and worked with, who rose to that challenge.

Here are two initiatives–one wholly unsuccessful, the other successful in its time.

Example–The King’s College Hospital Project

The King’s College Hospital project in the early 1970s was the first attempt to computerize patient records for the NHS. The project, under the auspices of a brave and innovative clinician, John Anderson (1921–2002), then Professor of Medicine at King’s College London–though not by profession an architect–was awarded a grant by the NHS to computerize medical records. The project was funded and expected to meet what were likely, at the outset, to have been considered well-understood clinical requirements, using well-established computer technology. It was commissioned by the then NHS Supplies Division. In that era there was almost nothing by way of digital imaging, computer networks or even standardized database methods. A five-megabyte disc cartridge was a bulky item.

The project purchased a batch-processing ICT mainframe for the purpose and spent most of the money, paddling hard under the surface, swan-like (and inevitably, in time, mirroring the dying swan of Swan Lake), to rewrite the operating system so that it would allow time-sharing on several terminals. In terms of clinical objective, the technology was a total mismatch to the imagined task at hand. The project faltered and the short Lancet article burying it was titled, loftily, and with echoes of the still remembered demise of Edward the Sixth, ‘The King’s Failure’!

Example–The London Hospital Project

A project focused on hospital patient administration commenced at the London Hospital. The clinical leadership again came from a Professor of Medicine, this time from Robert Cohen (1933–2014), an endocrinologist and a good colleague and friend of John Dickinson, at Bart’s. The London Hospital administration was led at the time by Michael Fairey, who went on to lead the NHS national programme and take a seat on the new NHS National Executive, creating a new directorate of information strategy and management. The head of finance, Budd Abbott, a canny street fighter in the politics of NHS organizations, championed it alongside. Theo Brueton was the IT lead who built a large team and computer centre for the project. Barry Barber (1933–2019), a physicist with interest in operational research and confidentiality of health data, provided those areas of expertise.

A dedicated building was created for the computer centre and a monster mainframe, with a huge spinning UNIVAC drum data store, ensured the most rapid possible access. This project did indeed computerize the administration of patient flow through clinics, wards, theatres, laboratories and imaging. It captured and communicated data throughout. It steered clear of the medical record itself, adding printed copy from the computer to the paper record, which remained the carrier of the ‘who did what, when, how and why, with what result’, at the heart of the hospital operation and mission. In this era, the medical mission had retreated behind the doors of hyper-specialism. General medicine and general physicians had divided into twenty or more specialties.

Barber, B., R. D. Cohen and M. Scholes, ‘A Review of the London Hospital Computer Project’, Medical Informatics, 1.1 (1976), 61–72.

Over the following years, others followed a similar pioneering route at hospital level (for example, the redoubtable Howard Bleich (1934–2021), at the Beth Israel Hospital in Boston). Several large mainframe manufacturers attempted to commercialize these patient administration systems (for example IBM with their Patient Care system). Along with other major players in IT, they saw potential and pitched into the flow. Encountering rapids downstream, they mostly bailed out! IT interests earned and lost huge sums in those times; health care was often a loser, too.

Change towards the new era of minicomputer technology made possible a different kind of project, one owned and operated at departmental level, experimented with and deployed as a component of the department’s professional services and activities. Early into this arena were hospital physics departments, and the radiation physics and imaging services they supported. Alongside were the laboratory scientists and pathologists. Pioneering clinicians led the development of record-keeping systems for specialist departments. The focus was very much exploratory, discovering how computers might prove useful in a practical clinical context. One such pioneer in London at that time was the nephrologist Hugh de Wardener (1915–2013), a professor of medicine at the Charing Cross Hospital, who worked with Mike Gordon on his impressive Clinic 1 system. Mike was a close colleague of our team at UCH in the early 1970s.

Preeminent in the era was the physics department of the Royal Marsden Hospital in London. Jo Milan (1942–2018), working with Roy Bentley (1930–2017), developed there the first computerized radiotherapy treatment plans, using one of the first widely used minicomputers produced by the Digital Equipment Corporation. This was the PDP-8, and the system was called Rad-8. Jo became a master of getting the most from the 4k banks of memory and 32k quanta of disc storage, programmed in the machine and assembler languages of the era. In subsequent decades, Jo also created and led the nationally renowned information infrastructure for tertiary cancer care at the Marsden. This was, by a wide margin, an extremely impressive outlier in terms of clinical acceptance, quality and value for money achieved, as reported in the 2000 national survey of the impact of IT within all ninety-three NHS Trusts. Jo’s massive contribution as architect of this era is celebrated among my stories of pioneers, in Chapter Eight, where I focus on clinical information architecture and attempts towards its standardization. It was a superb example of engineering excellence, health care focus and dogged determination, with huge local success–sadly not well understood, valued and duly recognized by the hospital management in the very place in which it grew and prospered, over four decades. Jo was a uniquely talented and committed, wonderful friend.

The database software market became lucrative and competitive and keeping up was tough. Products came and went. Ingres blossomed and went away; Oracle stayed the course and reaped rich rewards. As discussed in Chapter Five, capacity and performance requirements became more demanding as databases spread more widely into industry and commerce. The challenge for programmers centred less on accommodating the limitations of devices and more on meeting the requirements of applications, which demanded the rigorous implementation of new and evolving types and extents of data–something that proved tricky, bordering on impossible, to achieve satisfactorily with the technology of the times.

These requirements were well exemplified in clinical records and the pioneering work of Octo Barnett (1930–2020), Neil Pappalardo and Howard Bleich, in Boston at the Massachusetts General Hospital (MGH) and Beth Israel Hospitals. These teams developed the Massachusetts General Hospital Utility Multi-Programming System (MUMPS) and the MEDITECH Interpretive Information System (MIIS) with this sort of domain as their focus. The MUMPS global variable concept brought the database down into the software domain, supported by a much simpler software interface to a balanced-tree representation of the data on the disc backing store. Pappalardo founded Meditech and went on, by 2018, to become a demi-billionaire. Barnett’s MUMPS legacy persists at the heart of hugely profitable health IT businesses today. I am not sure whether he died wealthy–probably not, and maybe that was not a capability or priority for him–but he did make a huge difference. I celebrate his pioneering contribution in Chapter Eight. Making a dollar and making a difference are two different things–it is the lucky who manage to combine the two!
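
To illustrate the global variable concept: a MUMPS global behaves as a sparse, hierarchical array, addressed by paths of subscripts and kept in sorted order on disc. The following minimal sketch, in Python rather than the MUMPS language itself, mimics that behaviour in memory; the class and the example data are hypothetical, for illustration only.

```python
# A minimal sketch (not MUMPS itself) of the idea behind a MUMPS 'global':
# a sparse, hierarchical array addressed by subscript paths, with the keys
# held in sorted order so that siblings can be traversed in sequence --
# the role played on disc by the balanced-tree storage described above.

class Global:
    """Illustrative in-memory stand-in for a persistent MUMPS global."""

    def __init__(self):
        self._nodes = {}  # maps subscript tuples to stored values

    def set(self, *path_and_value):
        *subscripts, value = path_and_value
        self._nodes[tuple(subscripts)] = value      # cf. SET ^PATIENT(123,"name")="SMITH"

    def get(self, *subscripts):
        return self._nodes.get(tuple(subscripts))   # cf. $GET(^PATIENT(123,"name"))

    def children(self, *prefix):
        """Yield the next-level subscripts under a node, in sorted order (cf. $ORDER)."""
        depth = len(prefix)
        seen = set()
        for key in sorted(self._nodes):
            if key[:depth] == prefix and len(key) > depth and key[depth] not in seen:
                seen.add(key[depth])
                yield key[depth]

patients = Global()
patients.set(123, "name", "SMITH, JOHN")
patients.set(123, "labs", "1976-05-01", "Na", 141)
print(patients.get(123, "name"))            # -> SMITH, JOHN
print(list(patients.children(123)))         # -> ['labs', 'name']
```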

MUMPS-based systems proved extremely flexible and powerful in clinical contexts and became a mainstay of implementations around the world in coming decades, with many highly successful devotees. As in Jo Milan’s systems at the Royal Marsden Hospital, MUMPS language programs were later combined with relational databases, to enjoy the growing power, flexibility and operational rigour these could provide. And, in later times, in the Internet era, non-relational data models emerged, to cope with new requirements posed in accommodating much larger aggregations of less structured data.

The increasing power and flexibility of the minicomputer enabled rapid progress in medical imaging. Pioneers like my colleagues of the time, Christopher Taylor at Manchester, Andrew Todd-Pokropek in London and Stephen Pizer at the University of North Carolina (UNC) Chapel Hill, combined mathematics, physics and computer science backgrounds in capturing and digitizing images from microscopes and from X-ray and radioisotope-based cameras and scanners. They developed algorithms to enhance and analyze the image, represented as a two-dimensional matrix of digitized elements (pixels), and methods to support and study its clinical interpretation. These methods extended to three-dimensional images, as body scanning technology advanced, and to a four-dimensional manifold of time series, to study body function over time. New contrast media enabled information on organ function to be captured and analyzed, giving further insight for the diagnosis and management of dysfunction. Imaging methods also supported treatment, for example enabling more precise targeting of tumours for radiotherapy. Such methods spread throughout the newer technologies of ultrasound and nuclear magnetic resonance imaging, and into the imaging methods of physical and life science today.
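
As a simple illustration of what operating on such a pixel matrix involves, the sketch below applies a three-by-three mean filter to smooth a small synthetic image. It is purely illustrative and assumes nothing about the bespoke systems of the era; the array values are invented.

```python
# A minimal sketch of the kind of pixel-matrix operation described above:
# a 3x3 mean filter applied to a small synthetic image to suppress noise.
import numpy as np

def mean_filter_3x3(image: np.ndarray) -> np.ndarray:
    """Replace each interior pixel with the mean of its 3x3 neighbourhood."""
    smoothed = image.astype(float)
    for i in range(1, image.shape[0] - 1):
        for j in range(1, image.shape[1] - 1):
            smoothed[i, j] = image[i - 1:i + 2, j - 1:j + 2].mean()
    return smoothed

rng = np.random.default_rng(0)
image = np.zeros((8, 8))
image[2:6, 2:6] = 100.0                          # a bright square 'organ'
noisy = image + rng.normal(0, 10, image.shape)   # add measurement noise
print(np.round(mean_filter_3x3(noisy), 1))       # the square re-emerges from the noise
```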

Godfrey Hounsfield (1919–2004), at Electric and Musical Industries (EMI) in Hayes, took his first steps in creating and commercializing computerized axial tomography. In this, a succession of scans of X-ray transmission, taken at different angles in a three hundred and sixty-degree sweep around the body, was used to model the absorption characteristics represented by a grid of cells dividing up the cross section of the body being imaged. The reconstruction of the image from this set of scans was called an Algebraic Reconstruction Technique (ART). It was a mathematical algorithm, and its encroachment into the scientific domain was questioned in a rather pompous commentary entitled ‘Is ART Science?’!
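
The algebraic idea can be shown in miniature: each ray measurement contributes one linear equation over the grid of unknown absorption values, and the estimate is corrected, ray by ray, until the equations are satisfied. The sketch below uses the Kaczmarz iteration, of which ART is an instance, on an invented two-by-two ‘body’; it is illustrative only and is not Hounsfield’s implementation.

```python
# A minimal sketch of algebraic reconstruction: each ray sum gives one linear
# equation over the grid of unknown absorption values, and the estimate is
# repeatedly nudged to satisfy each equation in turn (the Kaczmarz iteration).
import numpy as np

def art(A, b, iterations=50):
    """Kaczmarz/ART: adjust x so that each ray equation A[i]·x = b[i] holds."""
    x = np.zeros(A.shape[1])
    for _ in range(iterations):
        for i in range(A.shape[0]):
            row = A[i]
            x += (b[i] - row @ x) / (row @ row) * row
    return x

# Unknown 2x2 grid of absorption values, flattened to [c00, c01, c10, c11].
true_cells = np.array([1.0, 2.0, 3.0, 4.0])
A = np.array([
    [1, 1, 0, 0],   # ray through the top row
    [0, 0, 1, 1],   # ray through the bottom row
    [1, 0, 1, 0],   # ray through the left column
    [0, 1, 0, 1],   # ray through the right column
], dtype=float)
b = A @ true_cells                    # simulated ray-sum measurements
print(np.round(art(A, b), 3))         # recovers values close to the true grid
```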

Minicomputer manufacturers such as DEC and Varian developed bespoke systems for applications in radiotherapy treatment planning and nuclear medicine. In time, the computer became integral to the design and operation of the imaging or treatment device, and these device manufacturers resumed their central place in the market for such systems.

Era 2: 1970s–1980s–Medicine

The growing range of pioneering initiatives in Era 1 engendered wider awareness of the future potential and significance of the unfolding domain of information technology. This spread from science and engineering communities into public and private sector organizations, and national and international policies. In medicine, where did this all fit within traditional ways of working, and professional roles and responsibilities? In the running of hospitals and other organizations, what new roles and skills were required and what changes were needed?

With the increasing range and complexity of services, costs rose. And whereas clinicians and institutions had traditionally worked with some autonomy and limited governmental oversight, often drawing on the perspectives of the great and the good among insiders, there was increased pressure for wider scrutiny and overview at local, regional and national levels. NHS services were reorganized, grouping institutions within districts, areas, regions and national centres. Reorganization became a watchword of the day, the perceived way to tackle the growing and developmental pains of health care in the Information Age.

A new and sharper profession of health care management started to supersede the more gentlemanly world of hospital administration–of course the gentlemen were men, but not always gentle! Culture wars ensued. It would seem natural for clinicians who know their profession to progress into health care management, and they carefully guard their own citadels. However, managerial professionalism established citadels of its own, and non-medical health care managers nailed their colours to them, and defended them, too!

I was working at the time in daily contact with the senior staff at the centre of Bart’s, a venerable, both loved and disliked, NHS hospital. It was an unusual one, in that it combined eminent and landmark personalities and ways, with a modern and forward-looking outlook. It had immutable City of London links to money and influence. The Lord Mayor came on a ceremonial visit each year. Thus, the many dimensions of change being played out between medicine and nursing, between clinicians and managers, scientists and clinicians, and between local and district/regional/national politics, influence and power, were present in its life and community. It was a theatre holding a mirror up to health care of those times–it certainly attracted some theatrical personalities onto its staff! Battles over computers bubbled up in academic and clinical service departments, and between hospital and university, over the ownership of the domain. Where should investments be made, by whom and according to whose plans?

In medical education, traditional curriculum wars intensified, over whose priorities would win out, in time and resource devoted to teaching students, and the money that went with it. Computer-assisted learning came into being. Within the professions, almost every specialist Royal College and health care body established a computer group to consult, research and advise policy on these matters. The Royal College of Physicians, prompted by the gastroenterologist Wilfrid Ingram Card (1908–85), established its Computing Committee. This focused its activities on his particular interest in formalizing theory and practice of diagnosis (a discipline that was understood rather differently by different researchers and practitioners across the domain, it emerged), and on exploring statistical methods for analyzing decision making.

As introduced in Chapter Four and rehearsed again here, for completeness of the chapter, this group became a meeting place for leading figures in medical informatics of the era. Card, who had teamed up with Dennis Lindley (1923–2013) at UCL, proposed a statistically grounded formalism as a theory of medical decision making. He was succeeded by Robin Knill-Jones, who teamed up with the oncoming greatness of David Spiegelhalter, with whom he collaborated on Bayesian methods for diagnosis of acute abdominal pain. Another luminary figure of the era also shone there. This was Timothy de Dombal, who masterminded a simpler Bayesian analysis, bypassing the subtleties of conditional probability distributions, to analyze the same problem area. His work focused on trials extending over many countries and cross-fertilizing with other problem domains.
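
The flavour of such a simplified Bayesian calculation–treating the findings as conditionally independent given the diagnosis–can be conveyed in a few lines. The diagnoses, findings and probabilities below are invented for illustration only; they are not de Dombal’s data.

```python
# A minimal sketch of an independence-assuming ('naive') Bayesian diagnostic
# calculation of the kind described above. All numbers are invented.
from math import prod

priors = {"appendicitis": 0.25, "non-specific abdominal pain": 0.75}

# P(finding present | diagnosis), assumed conditionally independent.
likelihoods = {
    "appendicitis":                {"rlq_pain": 0.80, "nausea": 0.70, "fever": 0.60},
    "non-specific abdominal pain": {"rlq_pain": 0.20, "nausea": 0.40, "fever": 0.10},
}

def posterior(findings):
    """Return P(diagnosis | findings) for a dict of finding -> present (True/False)."""
    scores = {}
    for dx, prior in priors.items():
        terms = [likelihoods[dx][f] if present else 1 - likelihoods[dx][f]
                 for f, present in findings.items()]
        scores[dx] = prior * prod(terms)
    total = sum(scores.values())
    return {dx: round(score / total, 3) for dx, score in scores.items()}

print(posterior({"rlq_pain": True, "nausea": True, "fever": False}))
```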

In the USA, pioneering work, such as that of Edward Shortliffe on the MYCIN system for diagnosing and treating infectious disease, followed a paradigm of rules-based reasoning, building on the LISP era at MIT in the late 1950s and the subsequent Dendral work at Stanford, as discussed in Chapter Two.

Both statistical and rules-based paradigms for decision making confronted issues of variance and uncertainty. How far was this associated with natural biological variation, or adequacy of measurement and observation, or adequacy of the conceptual models used to describe and reason about the domain, or other chance factors? How much did clinical context, expertise and experience count? How did clinicians themselves reason in these matters? How much did they differ, and why?

The human-machine interface became a focus of interest, in terms of both the ergonomics of practical methods employed and the psychology of perception and cognition. There was an explosion of research interest in matters of human judgement, more widely, leading to new thinking about clinical skills and their assessment, in formal education and in regulation of professional practice. These matters became of increasing professional concern and wider ethical, regulatory and legal significance, in this and following eras. Human grappling with the performance of tasks by machines led to deeper questioning of previous human understanding and performance of these same tasks–not an uncommon pattern of events as we set out to computerize. These became national preoccupations, touched on along my songline of the following decades, positioned as I was close to the heartbeat of the associated professional and academic communities.

In medicine, this era saw a major focus on management data, centred in the NHS on the landmark investigation and 1982 report led by Edith Körner (1921–2000), which I discuss in more detail in the section below on fifty years of policy review in connection with coevolution of health care services and IT. This set out what was required for the organization and management of clinical services for a typical community of 250,000 citizens, and their wider oversight and accountability. The separation of concerns of clinical management, responsible for looking after patients, and health care management, responsible for smooth and efficiently integrated services, was notable at this stage. It was nowhere better characterized than by Douglas Black (1913–2002), President of the Royal College of Physicians at the time, in a wise leading article reflecting on the report, published in the British Medical Journal. He emphasized the important contribution of management information and distinguished it from what he saw as a neglected balance with the information requirements of good patient care. He endorsed a similar critique of the report by the King’s Fund, an institution dedicated to health policy, with a mission in common with the Nuffield Trust and the American Commonwealth Fund. The article concluded thus, first quoting the report that: ‘Information technology is only exploited to the full when developments are information led, so that the information requirements must be identified first and only then a choice made from the wide range of technology available’. To which he adds ‘The point could perhaps be made more simply–“Don’t choose a computer until you know what you want to do with it”’.26

More colloquially, perhaps, the common refrain of our age: ‘To err is human, to really mess things up, buy a computer’! This carefully considered article is well worth reading forty years on, from one of the most respected and insightful clinical leaders of his generation.

Era 3: 1980s–1990s–Health Care

Hospitals and specialisms very much dominated the playing field in Era 2. In professional terms, general practice was still very much a poor relation. Academic departments and professors of medicine were not well established in universities in my early years, and it was not until the second half of Era 2 that professors of general practice appeared in numbers on the scene. The differentiation of primary and secondary care was long established, but the expansion and policy influence of primary care rose significantly in the oncoming Information Age, as lifespan increased, lifestyles changed, and more conditions became understood and could increasingly be managed near to home.

A new culture war intensified over money, power and influence. This played out around Bart’s, wedded to the wealthy culture of the City of London and connection with private medicine in Harley Street, but located adjacent to the poorest communities of London’s East End. General practice there drew motivated and radical pioneers, some with loud and quarrelsome voices, spoiling for a fight to put things right! General practice patient records became a major preoccupation throughout the country. Many general practitioner (GP) pioneers turned their attention to these records and found a new mission in efforts to computerize them. General practice is a huge domain, and commercial activity around its computerization has sprung up in many places.

Maturing microcomputers and computer networks gave a new context to these efforts, as costs of purchasing and running systems for practice management fell, and network connection to nationally provided information management services became much easier. Commercially developed systems started to emerge–at one time there were over twenty competing suppliers of practice management systems, all operating on a very similar landscape of clinical practice and data, but with little meaningful connection between the data and information models in their systems. National accreditation of systems began to apply a regulatory rudder to their development, to promote convergence. I observed this process playing out in the evolution of the ParaDoc system, pioneered in the East End by my close colleagues, Sam Heard and Dipak Kalra, which I describe further in the next chapter.

The attrition imposed by a continuous need to update systems, as experience in their use and changing service requirements evolved, and by the mandate that they should fit within a nationally determined set of requirements for accreditation, meant that only the fittest survived. This pattern continued over the coming two decades, with only a handful of survivors and some, by then, rather wealthy company owners, some of whom put their wealth to work in establishing new centres of research and innovation in the field. Some careers that remain distinguished to this day established their credentials in those times.

Era 4: 1990s–2000s–Health Systems

The rise of network telecommunication protocols and the pioneering work on a networked information system for the Conseil Européen pour la Recherche Nucléaire (CERN) laboratory in Geneva–the foundations of the Internet and World Wide Web–heralded a new era in the coevolution of health care services and information technology. In earlier pioneering times, there was slow recognition of the need for agreement about and standardization of the requirements to be met by information systems, common ways of defining and describing them, and the roles and tasks they were to serve.

As described in Chapter Two, the College of American Pathologists published a Systematized Nomenclature of Pathology. This was the seed of subsequent evolution and international alignment of the SNOMED nomenclature for medicine, subsequently carried forward by the International Health Terminology Standards Development Organisation (IHTSDO), later renamed SNOMED International. In the world of librarianship, classification of medical literature led to the establishment of Medical Subject Headings (MeSH), and a Unified Medical Language System (UMLS) was proposed as a language of medicine. The International Classification of Primary Care (ICPC) came and went, and the LOINC system for descriptions of laboratory measurements and observations, created and maintained by the Regenstrief Institute in the USA, achieved, and has sustained, worldwide impact in parallel with SNOMED. The WHO had long championed the widely used International Classification of Diseases (ICD). Health Level Seven (HL7) emerged in the USA as the mainstream of industry-led standardization of health care IT systems.

The pioneering terminology initiatives started as multiaxial systems, mirroring library classifications of earlier eras. The challenge of refactoring their earlier versions within the emergent new discipline of description logic, given also the scale of sunk cost and legacy content they represented, was daunting. SNOMED has embraced the transition, but ICD has not yet achieved this. A further project in the spirit of the Generalised Architecture for Languages, Encyclopaedias and Nomenclatures in Medicine (GALEN) will, no doubt, sometime prove necessary.
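
To make the contrast concrete: a description-logic foundation lets a terminology answer subsumption queries–‘is X a kind of Y?’–computationally, rather than by position on a fixed axis. The sketch below is a minimal illustration; the tiny hierarchy and names are invented, are not SNOMED CT content, and real terminologies use far richer, compositional concept definitions.

```python
# A minimal sketch of subsumption reasoning over an is-a hierarchy: the kind of
# query a description-logic foundation makes routine. The hierarchy is invented.
IS_A = {
    "viral pneumonia":    {"pneumonia", "viral infection"},
    "pneumonia":          {"lung disease", "infectious disease"},
    "viral infection":    {"infectious disease"},
    "lung disease":       {"disease"},
    "infectious disease": {"disease"},
}

def subsumed_by(concept: str, ancestor: str) -> bool:
    """True if 'concept' is transitively a kind of 'ancestor'."""
    if concept == ancestor:
        return True
    return any(subsumed_by(parent, ancestor) for parent in IS_A.get(concept, ()))

print(subsumed_by("viral pneumonia", "infectious disease"))  # True
print(subsumed_by("lung disease", "viral infection"))        # False
```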

The primary motivation for investment in these resources has been the standardization of secondary use of data needed for management of services, population overview and epidemiological research. Where these terminologies fit in underpinning the quality and effectiveness of direct patient care and its records of care, remains contentious. Everyone wants good care, data and records. No one wants the excessive cost and burden of data collection and management that they still currently impose. If there is to be information utility for health, this current transitional burden must somehow recede into the invisible background. This will not arise through managerial fiat. It will require the synthesis of clinical and engineering skills, in a human context. These will be needed to counterbalance the needs of the machine, lest they pull services too far under the control of mechanistic, protocol-driven and money-focused systems.

The impetus for progressive standardization of health systems has been twofold. The first was to help in creating rigorous and sustainable disciplines for the specification and design of interoperable systems. In earlier years, much of this was experimental and pragmatic. It was also untidy and vulnerable to error, inconsistency and inflexibility: in the framing, storage and retrieval of the information contained, in the correction of errors and weaknesses exposed, and in adaptation to changing requirements. This spoke to wider issues of fitness for purpose and safety, and to the burden imposed on users by a legacy of still operational but unwieldy and slowly declining systems. The second was to formalize and regulate marketplaces, making sure that apples were not pretending to be oranges, and bad apples and oranges were not easily confused with good ones. This topic is carried forward in the next chapter.

Era 5: 2000s–2010s–e-Health

The scientific watershed discovery of the double-helix structure of DNA dates from Francis Crick (1916–2004) and James Watson, at Cambridge in 1953, and the discovery of DNA itself, by the Swiss chemist Friedrich Miescher (1844–95), from nearly a hundred years before.27 The race to sequence the human genome, pitting the wealth and scientific mission of major funders, such as the Wellcome Trust, against the skill and entrepreneurial drive of the American Craig Venter, took place over the years spanning Eras 3 and 4. I knew and worked with the Wellcome leadership and project team that co-funded the sequencing of the human genome and the establishment of the Sanger Institute at Cambridge. Fred Sanger’s two Nobel Prizes attested to his scientific contribution to methods for unravelling the science of life within every cell.

From this work, intimately connected with Cambridge of that era and since, came foundations of the fifth era of coevolution of information and health care–that of genomics, personalized medicine and e-Health. It emerged alongside changing roles and relationships of patients and professionals, as the scope and scale of primary care and home and community-centred interventions increased, as did the depth and range of information accessible through the Internet. The continuing failure to coordinate health and social care, within their common geographical context but different funding and governance frameworks, became ever more troublesome.

My songline passed alongside great scientific pioneers of those times, such as the physicist Janet Thornton, who worked on protein folding at UCL and went on to establish and lead the European Bioinformatics Institute at Cambridge. The rise of bioinformatics, capitalizing on much cheaper and faster means for tracking genetic markers and sequencing whole genomes, started to transform the focus of life science, throughout its molecular, cellular, organ, whole body and population perspectives. What took years to accomplish, in sequencing the human genome, at a cost of a billion dollars, in the 1990s, led to it taking just days to sequence the SARS virus, twenty years later. New sequencing technology of the past twenty years now reaches towards whole genomes being sequenced in minutes, at a cost of hundreds of dollars. These achieve sequencing speeds one hundred thousand times faster than the earlier methods. Now we have the One Hundred Thousand Genomes Project and Biobank initiatives, tracking the genetic context of population health and guiding interventions and service designs in previously unachievable ways. This will be a central building block of the architecture of future information utility for health care. In a clinical context, it will require a corresponding standardization of the phenotype of care, captured in its records. That is what openEHR has focused on growing, as described in Chapter Eight and a Half. My songline has also passed alongside major European Union (EU) research consortia, such as those led by Norbert Graf, on infrastructure for cancer genomics clinical trials, where I was an external reviewer and then an advisory board member for nearly a decade. I celebrate Norbert’s pioneering work in Chapter Eight.

Era 6: 2010s–2020s–e-Commons

The present stage of coevolution of health and information technology into a sixth era, from the 2010s–20s, and extending beyond Birnbaum’s original timeline, is of Cloud-based technologies hosting applications software and information services and bringing new opportunities for collaborative work anchored in the e-Commons. I will focus on this in the coming chapters of Part Three. The Cloud now links from the largest to the smallest in the world of the Internet of Things.

The Weizenbaum Warnings

In 1976, Joseph Weizenbaum (1923–2008), an MIT computer scientist, published his landmark book, Computer Power and Human Reason: From Judgement to Calculation.28 It is a fitting counterpoint to the tone of the previous section, set out in successive eras along the Birnbaum timeline. The book contained informed and sombre foreboding about the harm he feared the Information Age might bring to human society. It was republished, with a further reinforced sense of peril, in 1984, the timing adding an Orwellian overtone to the warning. The writing of the book was a personal odyssey and he was in good company, acknowledging the support and advice he had received along the way from colleagues, including some who were, or became, luminary figures of the times–Noam Chomsky, known as the father of modern linguistics and a founder of cognitive science; Lewis Mumford (1895–1990), philosopher and historian of technology and city life; and Daniel Dennett, cognitive scientist and philosopher of biology, science and mind.

Weizenbaum was the creator of the ELIZA program, which ran on clattering teleprinter hardware and timesharing system software of the era, conducting a conversation with its user. It was a simple box of tricks, asking general questions and using a lexical scan of the program user’s responses to frame and pose more specific follow-up questions. In this way, guided entirely by the terms and phrases adopted by the user and some simple heuristics of grammar, ELIZA conducted a ‘conversation’. It was following, slavishly, in the conversation, but gave the user the impression that it was leading, authoritatively.
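
The mechanism can be conveyed in miniature: scan the user’s words for a keyword pattern, reflect the pronouns, and return a templated follow-up question. The sketch below is illustrative only; its rules and responses are invented and are not Weizenbaum’s own script.

```python
# A minimal sketch of the ELIZA idea described above: keyword scan, pronoun
# reflection, templated follow-up. The rules here are invented for illustration.
import re

REFLECTIONS = {"i": "you", "my": "your", "me": "you", "am": "are", "you": "I", "your": "my"}
RULES = [
    (r"i feel (.+)", "Why do you feel {0}?"),
    (r"i am (.+)",   "How long have you been {0}?"),
    (r"my (.+)",     "Tell me more about your {0}."),
    (r"(.*)",        "Please go on."),
]

def reflect(phrase: str) -> str:
    """Swap first- and second-person words so the user's phrase can be echoed back."""
    return " ".join(REFLECTIONS.get(word, word) for word in phrase.lower().split())

def respond(utterance: str) -> str:
    """Return the first templated response whose pattern matches the utterance."""
    text = utterance.lower().strip(".!? ")
    for pattern, template in RULES:
        match = re.match(pattern, text)
        if match:
            return template.format(*(reflect(group) for group in match.groups()))
    return "Please go on."

print(respond("I feel anxious about my work"))  # -> Why do you feel anxious about your work?
print(respond("Perhaps."))                      # -> Please go on.
```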

In Weizenbaum’s eyes, the users testing the program responded to ELIZA and were quickly bamboozled and hooked into a trusting and open counselling session. Knowing the simple program and observing the engagement of its users, he was concerned. Quite apart from whether ELIZA would have passed a more realistic and rigorous Turing test of intelligence, which it certainly would not have, what did ELIZA say about computer power and human reason? His book, written amidst the rise of what he saw as a harmful obsession of children with computer games, reflected a profound cultural pessimism. The current debate about ChatGPT, producing plausible college student essays and news articles to order, drawing on billions of sources across the Internet, echoes the ELIZA dilemma. This software seems closer than ELIZA to Turing test accreditation, albeit perhaps not yet to an alpha grade or Pulitzer Prize!

Weizenbaum has a telling paragraph about the nature and role of tools, for those who use them and those who use their products. I quote at length as to paraphrase would do an injustice:

Tools and machines do not merely signify man’s imaginativeness and his creative reach, and they are certainly not important merely as instruments for the transformation of a malleable earth; they are pregnant symbols in themselves […] An oar is a tool for rowing, and it represents the skill of rowing in its whole complexity. No one who has not rowed can see an oar as truly an oar. The way someone who has never played one sees the violin is simply not the same, by very far, as the way a violinist sees it. The tool is also a model for its own reproduction and a script for the re-enactment of the skill it symbolises. That is the sense in which it is a pedagogic instrument, a vehicle for instructing men in other times and places in culturally acquired modes of thought and action. The tool as symbol in all these respects thus transcends its role as a practical means toward certain ends: it is a constituent of man’s symbolic recreation of his world. It must therefore inevitably enter into the imaginative calculus that constantly constructs his world. In that sense, then, the tool is much more than a mere device: it is an agent for change. It is even more than a fragment of a blueprint of a world determined for man and bequeathed to him by his forebears–although it is that, too.29

His eighth chapter is devoted to artificial intelligence, on which he writes:

I had once hoped that it would be possible to prove that there is a limit, an upper bound, on the intelligence machines could achieve, just as Claude Shannon, the founder of modern information theory, proved that there is an upper bound on the amount of information a given information channel can transmit […] It is now clear to me that, since we can speak of intelligence only in specific domains of thought and action, and since these domains are themselves not measurable, we can have no Shannon like measure of intelligence and therefore no theorem of the kind I had hoped for. In plain words: we may express the wish even the opinion that there is a limit to the intelligence machines can attain, but we have no way of giving it precise meaning and certainly no way of proving it.30

The New Scientist that dropped through our letter box this morning, as I was writing, has the cover title: ‘Have We Got Intelligence All Wrong?’

Weizenbaum poses a question–‘What human objectives and purposes may not be appropriately delegated to computers?’–and responds, saying: ‘The question is not whether such a thing can be done, but whether it is appropriate to delegate this hitherto human function to a machine’.31 This was and remains a contentious matter, today even more so. His ninth chapter discusses the danger of incomprehensible programs, and records lively debates at MIT, pitting dismissal of what were perceived as irrelevant philosophical musings against contrary fears that rationality in human affairs would come to be equated with computability and logicality. The Weizenbaum plea is for human reason to be associated with more than the application of science and technology, and for these to be placed within a clear context of human dignity, authenticity, self-esteem and individual autonomy.32

In the 1984 edition of the book, a re-enlivened Weizenbaum added a tenth chapter, which would be a fitting set text to be critiqued by students of courses on artificial intelligence today. It addresses the imperialism of instrumental reasoning. I highlighted some of this philosophical debate in the context of theory of knowledge in Chapter Two. Referring to the promise and power of science and technology, he highlights the importance of the power to choose, saying: ‘Power is nothing if it is not the power to choose. Instrumental reason can make decisions, but there is all the difference between deciding and choosing’.33

He goes on to discuss the genome sequencing race of the era and biologists’ concerns about ethical science and practice. Two major questions arise:

There simply is a responsibility–it cannot be wished away–to decide which problems are more important or interesting or whatever than others. Every specific society must constantly find ways to meet that responsibility. The question here is how, in an open society, these ways are to be found; are they to be dictated by, say, the military establishment, or are they to be open to debate among citizens and scientists? If they are to be debated, then why are ethics to be excluded from the discussion? And finally, how can anything sensible emerge unless all first agree that contrary to what John von Neumann asserted, technological possibilities are not irresistible to man? ‘Can’ does not imply ‘ought’. […] A central question of knowledge, once won, is its validation; but what we now see in almost all fields, especially in the branches of computer science we have been discussing, is that the validation of scientific knowledge has been reduced to the display of technological wonders.34

In some of his argumentation, he would probably now, retrospectively, recognize that he was proved somewhat off-beam. Regarding speech and natural language translation understanding, he writes:

Yet we have to remember the problem is so enormous that only the largest possible computers would ever be able to manage it. In other words, even if the desired system was successfully designed, it would probably require a computer so large and therefore so expensive that only the largest and best-endowed hospitals could possibly afford it–but in fact the whole system might be so prohibitively expensive that even they could not afford it. The question then becomes, is this really what medicine needs most of at this time? Would not the talent, not to mention the money and the resources it represents, be better spent on projects that attack more urgent and more fundamental problems of health care?35

Without wishing to disavow his conclusion, I note that he clearly could not have imagined a world fifty years later, with the mainframe of his time collapsed into the smartphone of today.

The Illich Apocalypse–Iatrogenic Disease

Health and care are deeply personal matters, and human rights and responsibilities are divisive matters in politics. The writings of Ivan Illich provide interesting insight into the extremes of these perennial arguments. One hesitates to call them alt-left-wing views, as they contain much that might equally be thought of as the alt-right perspective of our era, both decidedly authoritarian. They rather confirm the historian Norman Davies’s view that the extreme left and right of the continuum of political perspective are close neighbours at the respective ends of a horseshoe shape. A horseshoe-shaped magnet has a strong magnetic field near its north and south poles, but becomes rather weaker towards the centre. Impressionable acolytes are attracted there, but it is a toss-up as to which tendency they will adhere to, or maybe they will team up with both!

Illich was born in Vienna, studied theology and philosophy in Rome, and obtained a PhD in history in Salzburg. He moved to New York and was an assistant pastor in an Irish-Puerto Rican parish until 1956, when he moved to become vice-rector of the Catholic University of Puerto Rico. He later established, in Cuernavaca, Mexico, the widely known and controversial Centre of Intercultural Documentation, where he directed seminars on ‘Institutional Alternatives in a Technological Society’ with a focus on Latin America. Among his well-known radical cris de coeur are Deschooling Society (1971) and Limits to Medicine: Medical Nemesis: The Expropriation of Health (1975).36

In Medical Nemesis, he pitches in, in his characteristic apocalyptic style:

The medical establishment has become a major threat to health. Dependence on professional health care affects all social relations. In rich countries, medical colonization has reached sickening proportions; poor countries are quickly following suit. This process, which I shall call the ‘medicalization of life’, deserves articulate political recognition. Medicine is about to become a prime target for political action that aims at an inversion of industrial society. Only people who have recovered the ability for mutual self-care by the application of contemporary technology will be ready to limit the industrial mode of production in other major areas as well.37

Information technology as the saviour of society from medical nemesis–interesting how the poles are reversed from Weizenbaum’s perspective that therein will arise a deskilled and dehumanized world of health and social care! He signs off towards the end of the book, in similar form:

Medical nemesis is the experience of people who are largely deprived of any autonomous ability to cope with nature, neighbour and dreams, and who are technically maintained within environmental, social and symbolic systems. Medical nemesis cannot be measured, but its experience can be shared. The intensity with which it is experienced will depend on the independence, vitality and relatedness of each individual. 38

In talking this way, and using the term ‘symbolic systems’, I imagine he was referring to human culture imposed by machines and reflecting an industrial model of medicine. In his analysis, the medicalization of health starts with the language underpinning mechanistic concepts of disease:

The acute problems of manpower, money, access and control which beset hospitals everywhere can be interpreted as symptoms of a new crisis in the concept of disease. This is a true crisis because it admits of two opposing solutions, both of which make present hospitals obsolete. The first solution is a further sickening medicalization of health care, expanding still further the control of the medical profession over healthy people. The second is a critical, scientifically sound, medicalization of the concept of disease.39

As befits his political stance more widely, he attributes the failure to recognize iatrogenic disease to the profession’s reluctance to give up the status and power it has acquired:

Just as Galileo’s contemporaries refused to look through the telescope at Jupiter’s moons because they feared that their heliocentric worldview would be shaken, so our contemporaries refuse to face nemesis because they feel incapable of putting the autonomous rather than the industrial mode of production at the centre of their socio-political constructs.40

His prescription for new legislation governing health is focused on promoting personal autonomy:

[…] the debate (on health care systems) could be rescued if attention were focused on medical nemesis, if recuperation of personal responsibility for health care were made the central issue, and if limitations on professional monopolies were made the major goal of limiting legislation. Instead of limiting the resources of doctors and of the institutions that employ them, such legislation would proscribe medical technology to professionals until those devices and means that can be handled by laymen are truly available to anyone wanting access to them. Instead of multiplying the specialists who can grant any one of the variety of sick roles to people who are made ill by their work and their life, the new legislation would guarantee the right of people to drop out and to organize for a less destructive way of life, in which they would have more control over their environment. Instead of restricting access to addictive, dangerous or useless drugs and procedures, such legislation would shift the full burden of their responsible use to the sick man and his next of kin. Instead of submitting the physical and mental integrity of citizens to more and more wardens, such legislation would recognize each man’s right to define his own health–subject only to limitations imposed by respect for his neighbour’s rights. Instead of relying on professional expertise to verify such values that will guide them. Instead of strengthening the licencing power of specialized peers and government agencies, new legislation would allow popular choice to entitle elected healers to tax-supported health jobs. Instead of submitting their performance to professional review organizations, new legislation would have them evaluated by the community they serve. Such guarantees against the medical support of a sickening industrial system would set the stage for the practice of health as a virtue.41

Here again, there is ambivalence about technology, but the idea of new technological advancements potentially contributing to the wider social agenda he supports remains open. The year 1984 was still fifteen years away, and Orwellian angst did not feature in his perspectives on technology’s influence in human society. Working in poorer parts of Latin America, he had an awareness of information technology shaped by images of the flashing lights and punched cards of the 1960s mainframe. He writes about respectful technology:

Tekne–the art that produced the first type of tool–was a measured tribute to necessity and not the road to mankind’s chosen action.42

DNA features nowhere in the book, even nearly twenty years after its discovery–the scientific examples are from the times of the Industrial Revolution:

The loss of a normative human condition not only introduces a newness into the human act but also a newness into the human attitude towards the framework in which a person acts if this action is to remain human after the framework has been deprived of its sacred character. It needs a recognized ethical foundation within a new type of imperative. This imperative can be summed up only as follows: ‘act so that the effect of your action is compatible with the permanence of genuine human life’; very concretely applied this could mean: ‘do not raise radiation levels unless you know that this action will not be visited on your grandchild’. Such an imperative obviously cannot be formulated as long as ‘genuine human life’ is considered an infinitely elastic concept.43

Illich’s prescription is for greater personal autonomy. In his book Deschooling Society, discussed further in the section below on education, he argues for information systems that, viewed from today, bear a strong resemblance to those now characteristic of the World Wide Web. He argues for systems that build a bridge from knowledge and information held and regulated behind a protective barrier of privilege and professionalism, to systems accessible to, and under the governance of, autonomous citizens. He does not make this case in the context of medicine, but the quotations cited seem to indicate his thinking was moving in that direction.

The perspectives that I have introduced here are contrasting and extreme views of the current trajectories of health care services. Information utility for health care needs to steer between Illich’s fear of disempowerment of citizens at the hands of industrial and commoditized medicine, and a Novacene surrender to machine intelligence. There can be common and open ground between these extremes, and it is this that I seek to alight on in Part Three of the book.

Genetics and Genomics

When one thinks of an ‘information explosion’ in the context of health care today, the data generated by the unfolding story of human genetics and genome science must surely qualify for that moniker. It is hard to give a meaningful overview. So much is changing, and so little time has elapsed in which to gain perspective. The evolution of bioinformatics over the past thirty years has transformed life science and is transforming clinical science, championed from early days in the UK by the immunologist John Irving Bell, Regius Professor of Medicine at Oxford. Many measurements are made, and words written, but it is still early days in relation to the promised benefits for human health and wellbeing that these portend and foretell. It would be interesting to know how Illich would have interpreted the rise of this new science and its impact on his appraisal of medicine and health care. The opportunity it opens for personalized medicine is surely one step towards greater personal autonomy and choice in the relationship of patient, professional team and health service.

A diagram showing the interaction over time of factors causative of disease–listed here as ‘exposure’, ‘inherited genetic variability’ and ‘chance’ (after Greaves, 2001).

Fig. 7.6 The interaction over time of factors causative of disease–after a lecture of Melvin Greaves, 2001. Image created by David Ingram (2003), CC BY-NC.

QrCode encoded link to a high quality image of figure 7.6.

Melvin Greaves has been a highly regarded cancer scientist throughout much of my career. Close friends have worked with him. It has been a pleasure to listen to his lectures, to understand the pattern of onset of disease that he describes. I jotted down this simple diagram (see Figure 7.6) that he showed in one of these lectures. It concerns how disease arises, not how it may be treated, and depicts the interaction of variability in the genetic inheritance of DNA, with exposure to harm in the living environment and chance happenings in life. It conveys important truth to counter overly zealous instrumental approaches to health and disease. There are echoes, here, of Jacques Monod (1910–76) and his pre-bioinformatics perspective of chance and necessity in life.44 There are also echoes of the recurrent ping-pong of perspectives on the interactions of ‘nature’ and ‘nurture’ in the development of living beings, digging deeper into issues of gender, ethnicity, environment and social inequalities, as determinants of health.

Three basic sciences and their associated technologies have illustrated the advancing power of medical science of the past one hundred and twenty years and its dependence on information technology. These are imaging science, with its origins in physics, mathematics and computer science; pharmaceutical chemistry; and molecular biology. Each has traced a path alongside the evolving capabilities of information technology, into new methods of measurement, analysis and intervention, their translation into everyday use through the industries that support them, and the people and health care organizations that use them. In prospect, today, are two newer technologies, emerging rapidly from their university origins seventy-five years ago and embodying ideas and methods from computer science, mathematics and engineering control systems. These are machine learning and robotics. Overarching these advances is the grand challenge of data integration in the context of ethico-legal records of care.

The discovery of the structure of DNA came from applied physics, and of its cellular mechanisms from applied chemistry. In amassing data from these experiments, a new world of understanding of biological structure and function arose, at the level of cells and signalling between cells, and in microbial systems of gut flora and the natural environment. Emerging from this scientific revolution is a new technology of synthetic chemistry, whereby biologically active molecules are created from building blocks of component chemical structures. This is rather like early electrical circuits being soldered together from discrete component resistors, capacitors, inductors, rectifiers, valves, transistors and the like, and then grouped within integrated circuits for higher level functions such as electrical signal processing.

Nearby to me at UCL, Janet Thornton, herself a traveller from physics into biology and bioinformatics, led the way in formulating new classifications of the folding structures of proteins. She has memorably described bioinformatics as the core discipline of biology. Bernadette Modell, a luminary figure in the WHO context in her study of the burden of inherited disease, and especially thalassaemia, pioneered information systems with our jointly-supervised doctoral student, Matthew Darlison, to bring knowledge of genetics to the affected family and patient communities.

Experimental methods of genetic analysis have enabled study of the propagation of inherited traits across generations of rapidly reproducing living organisms, such as plants, flies and yeast cells, and of patterns of disease in families of human subjects. These now connect through the databases and analytical methods of bioinformatics, to provide frameworks for the characterization of sequences, structures and functions of chromosomes, genes and proteins of living systems. Such databases enable identification and tracking of significant marker sequences within the genome, in determining pattern and progression of disease, and related personal risk factors, for individuals and their families, and across generations, influenced also by non-genetic inheritance pathways.
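
To make this concrete, here is a minimal sketch in Python, using an entirely invented sequence and marker, of the most elementary operation such databases support: locating the positions at which a marker sequence occurs within a stretch of genome. Real marker identification relies on curated reference databases and far more sophisticated alignment methods; this is only an illustration of the principle.

# A toy example: find every position of an invented marker within an
# invented genome fragment, allowing overlapping occurrences.
genome_fragment = "ATGGCGTACGTTAGCCGTACGATCGTACGTTAGC"   # invented sequence
marker = "CGTACG"                                        # invented marker

positions = []
start = 0
while True:
    hit = genome_fragment.find(marker, start)
    if hit == -1:
        break
    positions.append(hit)
    start = hit + 1      # step forward one base so overlapping hits are found

print(f"Marker {marker} found at positions: {positions}")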

The application of this bioinformatics discipline in pharmacology is developing rapidly, to help improve treatments. Related new information services are being created at the population level, to assist study of the aetiology and treatment of disease. Again, the challenge of integration of this data within a coherent care information utility is considerable. The field is prey to exploitation and fragmentation when these methods are adopted and scaled within new industries that operate outside the effective governance and regulation frameworks positioned to protect citizens from exposure to harm and to support them when harmed. The Web is increasingly populated with advertisements for pills that are being sold as mitigation for genome-correlated personal risk factors. Scientific trials involving yeast and drosophila can span their short-lived generations in the laboratory, but we are not yet quite ready for clinical trials of such interventions spanning long-lived human generations!

Education, Competence, Accountability and Risk

Education is the acquisition of the art of the utilization of knowledge.45

This Whitehead quotation rings true in highlighting issues of education policy, today, as emphasis shifts from learning facts, to learning how to access, interpret and use facts and methods, within contexts of new, real and virtual worlds of knowledge and experience. The primary schools of our grandchildren are ahead of the older generation in this. It will fall to them to see off ‘fake’ and ‘alternative’ facts in the anarchy of today’s World Wide Web.

As with health systems, education systems are changing rapidly as the world turns upside-down in its transition into the Information Age. We are healed and kept well by clinicians and carers, and we heal and care for ourselves as well. We are taught by educators and educate ourselves, as lifelong learners. The balance of personal and professional, in education as in health, is shifting rapidly in the Information Age.

Education connects with the philosophy of knowledge and mind. It connects with developmental neuroscience and psychology. We learn as we grow and as we go. Education connects with the assessment of taught disciplines in schools, colleges and universities, and with the assessment of work performance in profession, craft and trade, where the emphasis is on apprenticeship and learning on the job.

In the first half of the twentieth century, Maria Tecla Artemisia Montessori (1870–1952) and Jean Piaget (1896–1980) focused attention on the developmental autonomy and psychology of the child. Kurt Hahn (1886–1974) and Alexander Neill (1883–1973)–remarkably connected dates for these connected educational pioneers–focused on residential learning communities of secondary education at Gordonstoun School and Summerhill School, respectively. In tertiary education, there have been many shapes and sizes of new institution, and networks of institutions. The Adult School movement, and then the Open University, in the UK, brought new educational opportunity in adult life. In retrospect, the World Wide Web seems quite close to Illich’s prescription for a ‘de-schooled’ society. He championed the personal autonomy of learners and hated schools as much as he did hospitals! It would be interesting to hear how he would have reframed his prescription for education, given the concerns about regulation of quality of Internet content and learning resources that have arisen.

Today, we observe and listen to our teachers and consult libraries and other learning resources. We survey, experiment and practise. We learn alone and we learn with colleagues and in groups and communities. Some learning is easy, some is hard–at best enjoyable and motivating, at worst, prosaic and onerous–a mix of inspiration and perspiration. We express and demonstrate our learning through assessments focused on mental and practical articulacy, fluency and capability in the execution of tasks. And, through the Covid pandemic, much of this has taken place online, in connection with educational software and electronic learning resources.

Albert Einstein (1879–1955) purportedly said that ‘It’s not that I’m so smart, it’s just that I stay with problems longer’. For smartness, read intelligence, and we then enter the realm of what we value and measure when assessing and grading what is demonstrated through learning. Are we intelligent–what does that mean? Which intelligence quotients and other quotients of ability and learning are worth their salt as metrics of assessment–valid, reliable, reproducible and fair? What is emotional intelligence and how does this fit alongside? Should we be seeking more wide-ranging assessments, indicative of how capable we are of connecting and contributing over a lifetime of roles and opportunities? For example, valuing the capacity to negotiate and balance between conflicting viewpoints and to imagine creative solutions to complex and contentious problems. This takes us into another set of questions–about judgement, ethics and wisdom.

The human conundrum that is wisdom exercised George Bernard Shaw, in the dialogue of humble waiter and pompous lawyer, in his play You Never Can Tell (1897). The waiter’s gentle and polite riposte to some of the lawyer’s hubristic posturing–along the lines of ‘Well, Sir, if I may say so, if that is wisdom, then so much the worse for wisdom’–rings in my mind from my school days’ monthly trips to matinee performances for schools at the Old Vic theatre in Bristol. There seems not much more that can be said about wisdom, than is said in such literature!

Transition into the Information Age has placed these perennial questions and concerns about education and practice under a new microscope and within a wider macro-scope. The education scene is adapting to changing needs, embracing new opportunities and resources for teaching and learning, and challenging the status quo for everyone involved–students, teachers, institutions and professions. As with health care services, there has been a wealth of central initiative and local innovation, contained and directed within new approaches to audit and accountability. A few new mirrors have been added to this kaleidoscope, to fragment its images! An NHS University was established in December 2003 and quickly abolished, in 2005!

Medical and Multiprofessional Education

For successful education there must always be a certain freshness in the knowledge dealt with. It must be either new, in itself, or invested with some novelty of application to the new world of new times.46

Medical education has experimented with the division of phases between life science and clinical education, and with parallel and connected flow between the two. It has explored problem-based learning, drawing together different disciplines and ways of thinking, to address a specific clinical problem. It has thereby long recognized the concept of grand challenge, bridging and uniting disciplines in focus on overarching clinical problems, where solutions may embody them all. Such challenges may be intractable, but they are unavoidable.

Information overload in the curriculum, as in everyday clinical practice, has risen steadily alongside the explosion of knowledge and the diversification of specialism in health care. In medical education of the 1970s, this had become a significant concern in the context of student workload, as the figure based on Anderson and Graham’s paper indicates (Figure 7.7).47

A presentation slide reading: Educational implications–information overload. The traditional curriculum implies that students are expected to learn facts/concepts at the following rates: basic science, 24 facts/concepts per hour; clinical medicine, 9 per hour, excluding practical skills; modern languages, 6 per hour; OU half credit in statistics, 10 per hour; desirable rate of learning, 4–6 per hour (Anderson, J. and Graham, A., ‘A problem in medical education: is there an information overload?’, Medical Education 1980, 14: 4–7).

Fig. 7.7 Some early comparative metrics of the scale of factual content in the curricula of different first-degree subjects. Image created by David Ingram (2010), CC BY-NC.

QrCode encoded link to a high quality image of figure 7.7.

Assessment and regulation of clinical practice have faced wide-ranging challenge and the nature of professionalism is changing, more widely, as discussed in Chapter Eight. The mantra that assessment drives learning became the basis on which to define, constrain and regulate courses of education. Learning objectives for a course of study express what the student will be expected to be able to demonstrate and do. First work out and set out how you will assess, and then use this to define how you will teach and expect students to learn. This approach provided a more explicit focus of educational method in a field progressively overloaded with information, and where intra- and inter-observer variability in the assessments made was known to be high, and thus potentially unfair. Such a framework defines a common learning landscape for student, teacher, employer and regulator. Such clarity and predictability may, of course, come at a price of stifled creativity–an overly regimented factory is an unlikely place to find the freshness and novelty of education that Whitehead deemed essential.

Methods of formal assessment of clinical education and performance in practice have been consuming academic and professional issues throughout my career. Called upon to assess students, the exam boards of clinical medicine–whose members I watched come and go, and whose chat I overheard–were motley assemblies of crusty and talented folk. The top third and the bottom third of students being assessed pretty much defined themselves, but the middle ground was argued over, both vehemently and imprecisely. Opinion was rife! Assessment in mathematics and science tends to be a bit more precise and reproducible, and rather more dull!48

The formal assessment of knowledge and skills features increasingly in medical education and national professional regulation of competent and safe practice. I have been close to three national pioneers and leaders in these areas: Jane Dacre, Lesley Southgate and Charles Vincent. I described in Chapter Four the context in which our worlds aligned for twenty years–sometimes tumultuous but always creative, for all of us in different ways–and reflect on this further in Chapter Nine. Here, I revisit the connections made then, which shaped my thinking and work towards the creation of future information utility for health care.

Skills and Assessment

Jane Dacre worked with me at Bart’s to establish the first UK clinical skills centre, set up jointly between its medicine and nursing colleges. As medicine has become more accountable, so issues of rigour of assessment have come to the fore. Is it measuring the right thing? Is it measuring accurately? Is it reproducible among examiners, or with the same examiner at different times? Such issues permeate throughout assessment. Assessment has moved into the Information Age with the automation of multiple-choice question banks. The students’ answers are analyzed and grouped to provide statistical summaries that guide the setting of student population norms and grading boundaries and help in improving the rigour and usefulness of the tests themselves.
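
By way of illustration only, and with entirely invented response data, the following Python sketch shows the kind of item analysis such question banks support: for each question, a facility index (the proportion answering correctly) and a point-biserial discrimination (the correlation of success on the item with overall performance), the sorts of statistics that inform the setting of norms and grading boundaries.

# A sketch of item analysis over a multiple-choice question bank; all the
# response data below are invented. responses[s][q] is 1 if student s
# answered question q correctly, and 0 otherwise.
import math
from statistics import mean, pstdev

responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 0],
    [0, 0, 0, 1],
]

totals = [sum(row) for row in responses]   # each student's total score

def item_statistics(q):
    """Return (facility, point-biserial discrimination) for question q."""
    item_scores = [row[q] for row in responses]
    facility = mean(item_scores)                 # proportion answering correctly
    sd_total = pstdev(totals)
    if sd_total == 0 or facility in (0, 1):
        return facility, 0.0                     # no spread: discrimination undefined
    mean_correct = mean(t for t, x in zip(totals, item_scores) if x == 1)
    mean_all = mean(totals)
    # Point-biserial correlation of item success with total score.
    r_pb = ((mean_correct - mean_all) / sd_total) * math.sqrt(facility / (1 - facility))
    return facility, r_pb

for q in range(len(responses[0])):
    fac, disc = item_statistics(q)
    print(f"Q{q + 1}: facility = {fac:.2f}, discrimination = {disc:.2f}")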

The mantra of assessment driving learning connects with maxims on management from the business world, credited to Peter Drucker (1909–2005), the Austrian-born, modern-day guru of business management. Here are some of the ways he is quoted:

If you can’t measure it, you can’t improve it.
Management is doing things right; leadership is doing the right things.
The best way to predict the future is to create it.
There is nothing so useless as doing efficiently that which should not be done at all.

These are latched onto as sound bites in educational contexts, especially vocational education. Creating the future is the focus of innovators and their leaders. Innovation exists in the Drucker domain of discovery of the right things to do. Management of education is about good and efficient processes and is a gatekeeper role. Support for innovation is a wider role of leadership. Innovation in and management of education and assessment coexist, but embody different passion, perspective and leadership. Innovation operates, importantly and consequentially, to disrupt the status quo. It is resisted by interests that it challenges, as highlighted in Chapter Five. Doing the right things involves letting go of what once might have been thought right, but now no longer is, and recognizing that if we are not doing the right things, measuring them can risk doing more harm than good. Some cans of worms may best be left unopened. Useful disruption without harmful destruction is a difficult balance to strike.

Educational assessment method morphs into management strategy–the model of assessment becoming the model of management of learning. It risks becoming a game–between students and teachers, and between teachers and their institutions and regulators. Marshall McLuhan (1911–80) wrote of the medium becoming the message. The medium of assessment becomes the message whereby we communicate about learning. It is a necessary, but surely not a sufficient message. All this will have to come to terms with ChatGPT!

Practice and Performance

Innovation in methods of assessment of medical education, such as the clinical skills Objective Structured Clinical Examinations (OSCEs) that Jane Dacre pioneered in the Bart’s Clinical Skills Centre, has widened into the regulation of competent clinical professional practice. This trend has been mirrored in more stringent regulation of other professions and services, such as those of plumbers and electricians, and the certification of their competency and work conducted.

Independent audit of quality of care delivered by practitioners has become a matter of judicial determination by professional regulatory bodies, such as the General Medical Council in the UK. These are often difficult and emotive matters to decide within a legal framework, as in the case of a junior doctor colleague, known to one of my children, who mistook the labelling on a chemotherapy drug package late at night, when tired and alone on duty, and injected a drug by the wrong route, resulting very sadly in the death of the patient.

Lesley Southgate, an East London GP who worked with me in the foundational Good European Health Record (GEHR) research project that led to openEHR, and joined me, along with Jane Dacre, in establishing the Centre for Health Informatics and Multiprofessional Education (CHIME) at UCL, as described in Chapter Nine, became a leading national player in work on behalf of the General Medical Council (GMC). This became the basis of nationally mandated, albeit sometimes hotly contested and resented, procedures for the review of referred individual doctors’ clinical competence to practice and the adoption of more formal requirements for every practitioner to keep up to date in their field, by participating in continuing education and training programmes. Lesley was a doughty warrior and political campaigner for East End primary care and medical education. She worked tirelessly to deliver a very difficult and contentious brief, with her team set up within the comparative calm and protection of the CHIME academic department at UCL. She was well recognized by the leadership of the GMC for this work. She and Jane were both elected to lead their respective Royal Colleges and nationally honoured for their immense contributions.

Risk Management and the Law

The risk of harm being caused to patients because of the clinical interventions they receive, along with the collection of data and presumption of accountability for harm, has become of increasing concern in the transition of health care into the Information Age. This trend was highlighted in John Swales’s lecture, given when he was head of research and development for the NHS at the time of its fiftieth anniversary, as described earlier in this Chapter. Medical intervention can, of course, sometimes at best be a palliative measure, or of unlikely success. The death review meetings of my early career were where these matters were handled as part of clinical team culture and practice, within a protected and trusted hospital citadel. They were seriously undertaken, and sometimes uncomfortable, occasions–I lived in that community and heard about them. They sought to learn from experience, improve practice and avoid mistakes–which, again, are sometimes inevitable, with no responsibility or blame that can fairly be attached.

Over time, such a culture has shifted towards a more adversarial one of cover-up and avoidance of blame, within and beyond the clinical community. Medical malpractice and organizational failures in duty of care became more litigious matters. These concerns now attract wider public scrutiny, and over recent decades there have been notable UK public inquiries into the quality of health care services and the practice of individual clinicians, teams and organizations. Some, such as the Bristol, Shipman and Staffordshire inquiries, became matters of major national focus and concern. Attention was focused on causes and remedies.

Statisticians combed data provided to the 2000/01 Bristol Inquiry into excess deaths in children’s cardiac surgery.49 They found persuasive evidence that the excess mortality could have been discerned much earlier, given good quality data on surgical outcomes, combined with methods for analyzing such trends that were already in routine use in other sectors of the economy, for example in the quality control of manufacturing systems and in surveillance for significant trends in drug trials. Cardiac surgeons, nationally, were prompted to lead efforts towards making such surveillance a feasible matter of daily routine.
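
The methods referred to are the routine tools of statistical process control. As an illustration only, and not drawn from the Inquiry’s actual analysis, the Python sketch below tracks a simple one-sided cumulative sum (CUSUM) of observed minus expected deaths and flags a case series for review when the cumulative excess crosses a threshold; the baseline risk, outcomes and threshold are all invented.

# A toy one-sided CUSUM chart of observed minus expected deaths; every
# number here is invented for illustration.
expected_risk = 0.04        # assumed baseline mortality risk per operation
outcomes = [0, 0, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0]   # 1 = death, 0 = survival
alert_threshold = 2.0       # assumed level at which divergence is flagged

cusum = 0.0
for i, death in enumerate(outcomes, start=1):
    cusum += death - expected_risk      # accumulate observed minus expected
    cusum = max(cusum, 0.0)             # one-sided chart: track excess deaths only
    flag = "  <-- review" if cusum > alert_threshold else ""
    print(f"operation {i:2d}: cumulative excess = {cusum:.2f}{flag}")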

The psychologist Charles Vincent pioneered the study of clinical risk management, working in the Psychology department at UCL, and subsequently at Imperial College, in London. Through his colleague Pippa Bark, I drew this theme within the scope of the health informatics graduate programme I created at UCL from 1995, seeking to connect issues of data and data management with the culture and practice of risk management. As information utility becomes a more coherent, pervasive and connected reality, the aspiration for closer awareness of potential adverse risk and its mitigation will become more tractable. As things stood at the time of the above national inquiries, the political response was to impose greater requirements for central reporting of critical incidents, from within widely disparate and non-coherent information ecosystems. I have mentioned elsewhere the proliferation of burdensome computer-based reporting systems that were created. As in so many areas of sought-for quality improvement, coherence of the underlying data models is a sine qua non of successful methods that can be implemented efficiently and with the least possible operational burden.

The invocation to ‘do no harm’ dates from the time of Hippocrates (c. 460 BCE–375 BCE) and the invention of medicine. A future information utility can support that goal and help to enhance a culture that guides, supports and improves practice, as a shared professional endeavour, rather than simply providing chapter and verse in the reporting of failures, which exacerbates the now prevalent culture of blame and blame avoidance. There is much continuing effort towards improvement. A high proportion of clinical risk litigation traces back to problems of record keeping and continuity of care. Information overload is also implicated.

In the 1980s, research on human decision making was being presented in medical contexts at meetings of the Royal College of Physicians Computer Group. One such talk showed how humans could cope with, refine and improve clinical decisions, drawing on up to seven (the magic number) variables, but thereafter their capacity diminished, and decisions worsened. In the accumulating research literature assembled by Charles Vincent, information overload was demonstrated as a risk factor in acute medical situations such as intensive care unit (ICU) management.50 Standardization of interventions conducted in situations of heightened risk and uncertainty, such as the Advanced Trauma Life Support (ATLS) protocols used in Emergency Departments, has been shown to improve outcomes for the patient.

Central Roles of a Care Information Utility

The formulation and regulation of personal ethical and legal rights and responsibilities has become increasingly complex in the Information Age. Contending perspectives on privacy, ownership and openness of personal and population data have become contentious issues of debate and the subject of major and evolving legislation. They are put through the wringer and play out more openly in public places, affecting all sectors of the economy and its products and services. Coherent and consistent thinking about these issues matters more as technology advances and interventions become more powerful.

Clinical intervention will always involve potential benefit weighed against risk and cost–not much in life does not. Clinical assessment, whether deployed in health care delivery, academic examination, professional peer review or judicial proceedings, must inevitably weigh evidence and make judgements about probabilities. These judgements need the best possible scientific underpinning, but they also rest on issues of trust–in knowledge, data, expertise and people. How and why patients trust their professionals is also important to understand and appreciate. Trust is vital in clinical practice, and it is a two-way street–the rights of both citizens and their professional carers must coexist fairly alongside their just governance in the public domain.

Much of medicine is temporal. Much is about giving time and opportunity for the body to recover and heal itself. Much is a balance of risk, trial and error, often characterized by a sense of ‘wait and see’. The patient is a key player, over time, in their own maintenance of, or recovery to, good health. Mutual belief and trust in the relationship with their supporting professionals play a key part and require time devoted to them. Regulation of such a personal domain as health care, especially when things go wrong, carries risk of punitive litigation, leading to defensive practice, obscuration, deception and blame. These can extend into cultural miscues, misunderstandings and mistrust.

The current time and capacity constraints on clinical professional practice and working life do not seem consistent with good and feasible solutions to this ongoing and increasing set of interlinked problems. New common ground is needed, on which to adjust and balance the extraordinary and taxing combination of challenges that health care teams face in their everyday lives and careers, in their own struggle to find a sustainable mix of reasonable expectation and achievable reality. This impasse has embroiled both the professional teams and those they serve, in the overloaded health systems of today. Necessary reinvention and reform of services needs to be rooted in education about the changing nature and culture of health care teamwork and professionalism, and of the roles and responsibilities shared. There is much stirring in this direction.

And it is at the centre of the current imbalances that a new kind of information utility is needed, to help towards new fairways and fair ways of working, focused on outcome and value, rather than process and cost that have typified the runaway insolvency of Industrial Age medicine. Ways that balance and interface consistently, continuously, effectively and fairly, and that relieve undue or unnecessary burden on all sides. Health care services must advance alongside individual self-care. Assessment and regulation of professional education and practice must advance, likewise, alongside individual self-assessment and peer-assessment of learning, skills and competencies. This trajectory must join coherently with methods and resources for continuing professional education and quality improvement of services. Taken together, these will help to reshape what has become an unfairly punitive and defensive culture and burden on professional practice. This is reflected in the fragmentation and discontinuity of the current landscape of services. Its infeasibility has been exposed and exacerbated in the transition into the Information Age and must be put right in the Information Society.

These are not new thoughts. Let us look back again, in the wider context of assessment of skills and competences, to an early report of the Office of Technology Assessment (OTA) of the Congress of the USA. This was prepared jointly with the Association of American Medical Colleges (AAMC) and entitled ‘Computer Technology in Medical Education and Assessment’. It contains many interesting observations and highly pertinent, although still largely unmet, expectations of the future role of the computer in connecting a continuum of relationships between education, assessment of professional performance and outcomes for patients.

The use of computers in education and assessment inevitably will be linked to their uses in medical information systems. Such linkages will allow, if not force, the formation of new relationships between segments of medical education and assessment continuum, through accumulation of large databases on student characteristics and performance, on physician and institutional performance in patient care, and on patient outcomes following treatment. These databases could serve as the thread of continuity between portions of the continuum. They could provide more objective and quantitative feedback mechanisms from active practice to education and assessment.51

It further emphasized how a focus on standards and standardization could be expected to connect improved medical information systems with medical education of the future.

Currently the best measures of competence in learning do not necessarily predict good performance in practice. Patient care assessments depend on comparison with peers using standards (processes that should be followed) or empirically determined norms (the average care provided). Computer technology could be used to improve the linkage between medical education and patient care through the provision and maintenance of more specific and objective databases for diseases and treatments. In addition to providing better data for generation of standards, computer databases could allow better comparisons of standards and norms of care with actual patient outcomes. These data also could permit the development of computer consultant systems. Feedback from medical information and health data systems could provide continuous updating of the databases.52

Fifty years have passed, characterized by failure to show how to turn the promise into a reality. The idea and implementation of the care information utility, as set out in Part Three of this book, is, in significant part, about the practical realization of these now tractable and achievable goals.

Research

As we have seen, the role of the computer is still at a transitional stage in the pedagogy and assessment of learning in medicine and health care. By contrast, it has more rapidly transformed the scope, methods, scale and infrastructure of research, but not without difficulties specific to the health care domain. My songline has travelled widely across this changing landscape, over five decades.53 In keeping with the increasing scale of data capture and broadening scientific and geographical connectivity of clinical research of those years, I was close to many teams working to design, install, program and operate ever faster and more extensive computational facilities, link them across networks and enable them to handle and process ever larger data stores and computational loads.

The UK Council for the Central Laboratory of the Research Councils (CCLRC) was established to draw together and support such research endeavours across disciplines. Its home base is the inspiring national science campus at Harwell, near Oxford, and there are similar and closely connected computational science communities in many countries. The USA has the Oak Ridge National Laboratory, which is dedicated to ‘solving big problems’ and describes its ‘greatest strength’ as the people from sixty countries working there. I describe my connection with CCLRC and its great teams of scientists, in the science and computation section of Chapter Three, on observation and measurement.

Through these integrative scientific endeavours, built around shared computational methods and resources, scientists have joined forces to the mutual benefit of their respective communities, enhancing the kinds of research they are thus enabled to pursue. In medicine and health care, with their special responsibilities for handling personal and confidential data, activities have tended to remain fragmented within non-communicating silos of data. Considerable difficulty, both technical and organizational, was experienced in safely connecting the IT systems used for the academic and clinical service roles of clinical researchers, who sometimes ended up working with several personal computers connected on different firewalled networks.

The challenge of integrating data from these disparate silos is well illustrated by this slide of my clinician colleague, Richard Begent, which he used to illustrate the wide-ranging requirements of his cancer research (Figure 7.8).

A presentation slide showing a diagram of the cycle of ‘Integration through schematics’. The slide is entitled ‘Integrating discovery and evaluation of new treatments’ and the cycle moves clockwise: ‘Patient / Tumour / Target / Design of therapy / Generation of therapeutic agent / Characterisation of therapeutic agent / Testing in experimental models / Devising therapeutic strategy / Testing in individual patients / Testing in populations’ (Begent, 2001).

Fig. 7.8 Integration of research and practice through informatics–after a lecture of Richard Begent, 2001. Image created by David Ingram (2010), CC BY-NC.

QrCode encoded link to a high quality image of figure 7.8.

In such situations, there was much scope for research teams that had enjoyed, and preferred, an independent working life, and were reluctant to spend time pooling their efforts for the common good, to shelter under different organizational firewalls and make claims of exceptionalism! One way or another, a more cost-effective and scalable approach was increasingly necessary, and achieving it was a human as much as a technological challenge. I relate several of the local stories here, to illustrate how local and wider national research issues enmeshed, and how organizational development became a central focus and concern.

One of my tasks as a member of the biomedicine executive group of UCL was to create and populate a more cohesive, resilient and efficient common ground of IT support services, drawing together members of some ten long-established separate small teams. This involved gaining their trust and commitment, and permission from the senior academic leaders they worked for, some fearing loss of autonomy and the funding for IT that they enjoyed within their separate domains. They had different needs for connection with clinical services in their different local NHS Trusts. My task also involved building a good relationship with the corporate IT support services of the university, that already provided a wide range of computing services pursuant to the University’s central information strategy. I was a broker among highly intelligent, experienced and successful teams and leaders, where mutual trust was not always the order of the day! The different team members identified with and felt protected within the different local departments, faculties and institutes that they worked for. We met over two years to articulate and create a shared mission and common approach to the IT support services.

In parallel, we discovered that within the largest of the UCL-linked NHS Trusts, UCLH (University College London Hospitals), there were some three hundred separate small computer systems in operation, funded locally and often running software that was idiosyncratic, poorly documented and sometimes of unknown design. I discovered this through the dissertation project of one of my Master of Science (MSc) students in health informatics, who went on to lead cancer information services in a national research institute. This situation was common to other major medical schools where I enquired. The NHS side of the research challenge we faced was clearly, in itself, a highly fragmented IT domain.

A few years before embarking on the IT support services initiative described here, a similar effort was devoted to creating a network of the many clinical research investigators and their teams, based in the eight constituent NHS Trusts linked to UCL, later called UCL Partners. The UCL Chief of Medicine of those times, Leon Fine, and the Director of Research and Development at the Institute of Child Health, Great Ormond Street, Al Aynsley-Green, asked me to join them in establishing a Clinical Research Network Board to oversee this project.54 The Trusts involved were sometimes quite fiercely independent institutions! The Royal Free Hospital in Hampstead; the UCL Hospitals in Bloomsbury; the Whittington Hospital in Archway; the Institute of Neurology at the National Hospital for Nervous Diseases site in Queen Square; the Institute of Child Health at the Great Ormond Street Hospital site; the Institute of Ophthalmology at the Moorfields Eye Hospital site; the National Centre for Orthopaedics at Stanmore; the Eastman Hospital Dental Institute in Gray’s Inn Road. All told, an annual turnover now in excess of five billion pounds. It naturally fell to me and one of my IT support team colleagues to create a searchable database of investigators and projects. Another taxing human exercise! Much of organizational development these days is driven by and revolves around innovation in information systems. That involving health care is no exception, but it is typically a harder task because of its multiple interconnections across academic and health care service domains and constituencies!55

Extending from these local roles at UCL, I was drawn into efforts to tame the wider computational challenges posed by large-scale scientific research of the Information Age, as a member of the Medical Research Council’s informatics board and then representing it on the national e-Science Programme Board. This drew together representatives from the Biotechnology and Biological Sciences (BBSRC), Engineering and Physical Sciences (EPSRC), Central Laboratory of the Research Councils (CCLRC) and Economics, Environment, and Social Sciences (EESRC) research councils and connected also with the Wellcome Trust and the NHS Information Centre. Some of the e-Science ‘moonshot’ initiatives, to use the UCL economist Mariana Mazzucato’s term, spread over five or more years and progressed very well. Alongside worldwide efforts, they pioneered a new generation of networks and grids of computers, leading over the next decade to the technology underpinning the commercial Cloud data centres and computational resources of today, including those of global corporations such as Microsoft, Amazon, Google, Apple and Alibaba.

This research computing community grew to span disciplines and continents. In contrast with the medical, health care and social sciences, other sciences did not face the challenge of coherent and confidential linkage with data obtained from operational health care systems. Such clinically linked research lagged behind as a result but has been investing to catch up in recent years. At the same time, there were other differences and rivalries in play. For example, the wider eScience research community sometimes envied and somewhat resented the virtuosity of its physics community membership, which could always be relied on to have the most demanding computational challenges to put forward, and the best worked up and coordinated bids for national funds, to create and run the computer infrastructure their science required, and for these to be adopted as a top national priority!

Now, everyone has the equivalent of what was the largest mainframe research machine at the start of my academic career, in their laptop, connected via the Internet to the massively more powerful computational resources and data warehouses of today. And the software that runs within this infrastructure has matured beyond recognition. In similar virtual proximity are the electronic libraries and archives of research and publication, also worldwide.

Realization of the undoubted research potential of the NHS has also been hampered by the lack of a semantically coherent information architecture, of the kind that initiatives such as openEHR have been experimenting with, specifying and disseminating. Moreover, the wider health care IT domain is a very substantial commercial marketplace which has, unsurprisingly, long been kept under the watchful eyes of many powerful industry interests. Companies retain control through contracts, Trust by Trust, covering the use of the proprietary information infrastructures that underpin their health care products and services. The operational clinical data arising in everyday health care delivery is thereby managed by the health care services concerned in a proprietary manner. And this inevitably leads them to have a close dependency on the particular companies they contract with, and the hardware and software technologies employed in their systems. This is not a good position from which to sustain lifelong records of care.

The governance and management of personal health care data, nowadays seen much more as owned by the citizens it concerns, and the governance and management of the related software applications, of both public and private provenance, that create, store and process that data, are slowly coming to be seen as separate and separable concerns. The separation of these concerns, sustained on the basis of global and public domain standardization and governance of care records, is central to the future care information utility that this book foresees.

Information Policy as a Wicked Problem

Gladstone […] spent his declining years trying to guess the answer to the Irish Question; unfortunately, whenever he was getting warm, the Irish secretly changed the Question.56

Joking apart, we’re all a bit like that! Neither we, nor the Irish, for that matter, are, or would like to think we are, particularly wicked! But this joke is funny because it reveals the human side of many a difficult, seemingly intractable, human dilemma. These have been called ‘wicked problems’. In the next sections of this chapter, I focus on the framing and history of national policy for information systems and technology that support health care services. This has been termed a ‘wicked problem’ and this section is about wicked problems in general, and how health care policy fits the bill. Horst Rittel (1930–90) and Melvin Webber (1920–2006) used the term to characterize socio-technical problems that arise in social policy formulation.57 Policy for health care IT ticks all their boxes of wickedness.

In the paper, they compare these wicked problems with the ‘tame’ problems of science–a bit of special pleading, perhaps! They characterize the wicked problem as one lacking definitive description, and for which the public good to be addressed by solving the problem is always disputable within a pluralistic society. Likewise, they argue, there can be no objective principles of equity involved in weighing solutions. There can be no correct or false answers, and only by imposing ‘severe qualifications’ on the definition of the problem can solutions be considered in any sense optimal.

But how far would such a characterization be out of place in describing the riddles that physics wrestles with in delving the depths of the ‘What is reality?’ question? Leaving aside this piece of, no doubt eclectic, special pleading on my own part, if science is seen as posing ‘tame’ problems and social problems are ‘wicked’ ones, health and care, being problems of science and society, and the engineering that joins them, combined with the anarchy of transition to a knowledge and information-based economy of global reach, must qualify at the super-fiendish end of the Sudoku spectrum of wicked problems! Quoting from another context entirely, a wicked problem might be described as ‘a riddle wrapped in a mystery inside an enigma’–I leave that uncited, not wishing to stir unwarranted association! But I smile to note the association, here, of Rittel and riddle!

Rittel and Webber go on to argue that only by pursuing and reviewing alternative solutions can the nature of a wicked problem be understood, and a solution refined over time. Such problems are never completely solved, and solutions adopted require adaptation in the behaviour of the community addressed. In such connected worlds of policy and practice, it is unsurprising that the wicked problem lacks clear ownership and leadership. Any party aspiring, conspiring and perspiring to take control must not fail, and the way in which they take control is as important as how the problem itself is tackled.

This seems the right place in the book to emphasize a crucial connection that runs throughout. This is the inextricable interconnection of the approach taken to the resolution of wicked problems and the methods, teams and environments whereby they are evolved and implemented. In the next two chapters, the principal focus is on issues of implementation. Elsewhere in the book I have several times drawn connections between how, in both science and society, the challenge and uncertainty of such enigmas are approached–in tackling them (as in science) and in reacting and adjusting to them (as in society at large). These concerns are different in kind, and therefore in approach, but there are commonalities, too. Bifurcation and emergent transition, complementarity and dualism, polarization and dichotomy, crop up in several chapters. They cropped up in the Introduction, illustrated from a luminary scientist’s perspective, in Robert Oppenheimer’s Reith Lectures. Writing at the dawn of the Information Age, he introduced the riddle of quantum theory, and the idea of complementarity, as an analogy from which to reflect on riddles of society and human values. His lectures resonate today, as we live through the Whitehead anarchy of scientific and societal transition that has ensued from those times. Similar concerns, from a societal perspective, were reflected in the quotations from Primo Levi and Voltaire, in the Introduction and Chapter Eight.

Leaders of science and engineering reacting to and pursuing solutions to their enigmatic problems, and leaders and citizens of society at large doing likewise, when faced with theirs, may well have to behave and choose differently. But not necessarily always so! Choices made on all sides will reflect beliefs and temperaments, as much as a more strictly evidential weighing of ideas. Those shaping ideas for how new approaches to the enigmas of social policy should be ‘led’ might find it illuminating to study Oppenheimer’s seminal lectures, reflecting his experience of how ideas and leadership play out in science and engineering, and reflect in society.

The history of the digital care record, exemplifying that of health care information policy more widely, might be fairly described as one of riddles, mystery and enigma! My maxim that the three top priorities of openEHR are ‘implementation, implementation, implementation’ rested on my implementation-focused approach to this wicked problem. As Rittel and Webber wrote, it is fundamental to the taming of wicked problems in social policy, although I demur, as above, from how they contrast them so emphatically with those of science. That feels like an unhelpful dichotomy. I prefer to think in terms of trifecta (as in ‘a situation in which you achieve three things’). I will expand on this idea in the following chapters. For now, the tripling in the implementation maxim is for emphasis of its priority, rather as the then UK Prime Minister once stressed the importance of ‘education, education, education’.

Policy for an information utility conflates divergent interests and understandings of the purposes served by its information content and who is responsible for it, both as supplier and regulator, and of the scope, specification, supply and operation of its related information infrastructure. An example from health care is critical incident reports, whereby awareness and response to incidents of exposure of patients to clinical risk are monitored. These flow from the medical directorate of the local Trust where the incident occurs to the office of the Chief Medical Officer, nationally, and sadly, sometimes, into the judicial system, too. I remember this topic coming up in discussion with a UK Chief Medical Officer of the time, at a joint USA/UK conference on health care quality where I was invited to contribute. They told me of the thirty different structures and formats in which these were compiled and reported, from different computer systems across the NHS. The aggregated information carries risk of inaccuracy and bias because of divergence in the ways it is collected, collated and summarized. Perhaps this situation is now better standardized. This critical incident scenario has been mirrored in stories of Covid pandemic data being collated centrally, by cut and paste from very many submitted documents into a central spreadsheet.
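
A minimal sketch, with entirely hypothetical Trust names, field labels and grading scales, illustrates the collation problem being described: each locally structured incident report has to be mapped into one common structure before central aggregation can be trusted to mean anything.

# Two invented local report formats for the same kind of incident, and the
# mappings needed to bring them into one common structure.
from datetime import date

trust_a_report = {"ref": "A-0173", "occurred": "2006-03-14",
                  "severity": "moderate", "summary": "wrong route of administration"}
trust_b_report = {"incident_no": 4821, "date_of_event": date(2006, 3, 20),
                  "grade": 2, "description": "dose calculation error"}

GRADE_TO_SEVERITY = {1: "low", 2: "moderate", 3: "severe"}   # assumed local scale

def normalise_trust_a(report):
    return {"source": "Trust A", "reference": report["ref"],
            "date": date.fromisoformat(report["occurred"]),
            "severity": report["severity"], "description": report["summary"]}

def normalise_trust_b(report):
    return {"source": "Trust B", "reference": str(report["incident_no"]),
            "date": report["date_of_event"],
            "severity": GRADE_TO_SEVERITY[report["grade"]],
            "description": report["description"]}

central_register = [normalise_trust_a(trust_a_report), normalise_trust_b(trust_b_report)]
for incident in central_register:
    print(incident)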

Information about patient allergies is of potential relevance in many contexts, within and beyond the health service–in social care, education, ambulance, police and hospitality services, for example. A common definition of this information and a service providing and maintaining it as a common national resource, accessible wherever relevant to be shared, is a candidate standardized information utility. Known drug interactions form another dataset best kept consistent, up-to-date and easily accessible and integrable with other systems, wherever this knowledge needs to be used.
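
As a sketch of what such a shared resource might look like, with invented codes, identifiers and interaction data throughout, the following shows a coded allergy entry carrying its provenance, alongside a simple check of a proposed prescription against a small table of known interactions.

# Invented example of a coded allergy entry and a drug interaction check.
from dataclasses import dataclass

@dataclass
class AllergyEntry:
    patient_id: str
    substance_code: str      # code from an agreed terminology (assumed)
    substance_name: str
    recorded_by: str         # provenance: who recorded it
    recorded_on: str         # and when

# Invented reference data standing in for a nationally maintained dataset.
KNOWN_INTERACTIONS = {("warfarin", "aspirin"): "increased bleeding risk"}

def check_prescription(current_drugs, proposed):
    """Return warnings for any known interaction involving the proposed drug."""
    warnings = []
    for drug in current_drugs:
        for pair in ((drug, proposed), (proposed, drug)):
            if pair in KNOWN_INTERACTIONS:
                warnings.append(f"{proposed} with {drug}: {KNOWN_INTERACTIONS[pair]}")
    return warnings

entry = AllergyEntry(patient_id="nhs-0000000000", substance_code="ALG-001",
                     substance_name="penicillin", recorded_by="GP practice (example)",
                     recorded_on="2019-05-02")
print(entry)
print(check_prescription(["warfarin"], "aspirin"))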

The challenge of achieving this level of coherent and useful standardization is considerable in the current diverse landscape of health care information systems. Some ten years ago, I was asked to chair a national board that oversaw efforts to standardize information about prescribing practice, seeking a common semantic framework throughout the NHS, accessible across different sectors of health care. The project started with a focus on primary care systems and brought together academic experts in prescribing practice, the suppliers of practice management information systems in use and the NHS team charged with managing the project. This mandate required some six suppliers of systems to adapt their software to comply with a relatively straightforward common data model. It took two years of extremely slow, tortuous and expensive work–a major headache for all concerned. Given a well-formulated service information architecture, there would again be good reasons and capability to make this a national utility, drawn on by all suppliers of systems. This process would much more easily enact and consume updates required over time, just as Apple or Microsoft update their operating systems online, for all their users.

We might compare this sought-for information utility with a water supply utility–both impact greatly on public health. Infrastructure for the capture, purification and distribution of water supply, and for the drainage and disposal of rainwater and sewage, created a healthier environment that contributed to the elimination of typhoid, cholera and other infectious diseases. The infrastructure is tangible, and the water itself is chemically the same everywhere, whether in overwhelming or short supply, clean or contaminated, and whether collected from aquifer, reservoir or river, or through desalination. H2O says what it is. It does not say how pure it is, reveal why it feels wet, or explain its surface tension. In its different forms, as liquid, ice and steam, it is always water.

Digitized information is all the same bits (or qubits), but its meanings are infinitely diverse. Data are captured in different types–integers, text strings, logical variables. They are grouped and annotated to convey further meaning and context. Knowledge bases, likewise, generate and communicate information that guides and supports decision and action. The purposes served by this information, the contexts of its use and the formalisms in which they are represented, interact within and between systems, and need to be accommodated consistently and coherently, throughout, when connecting and computing.

The nature of the information and the representational methods adopted for storing, interrogating and retrieving it, need to be consistent, clear and understood, whether engineered for use within a single system or shared among systems, more widely. Well-ordered or not, whether generically or locally applicable, the basis on which these representations are specified and constructed needs to be clear. And changes made over time, to remedy error and extend or revise functionality, need similarly clear provenance and governance. Where multiple architects design multiple systems, the specifications that they develop, and which the information engineers build from–syntax and semantics of the data and how it is being processed, using data models, information models and knowledge bases–all need to be consistent, coherent and declared. And formal terminology used needs to have its provenance and formal description, similarly declared.
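
The point can be made concrete with a small sketch, which is not openEHR’s or any other standard’s actual model and uses invented names and codes throughout: a data element that declares its type and units, the terminology binding it draws on, and the identity and version of the published model to which it conforms, so that any system consuming it can interpret it consistently.

# Invented example of a data element carrying its own type, terminology
# binding and model provenance, so that it can be interpreted consistently
# wherever it is shared.
from dataclasses import dataclass

@dataclass
class TerminologyBinding:
    system: str          # identifier of the agreed code system (assumed)
    code: str            # invented code
    display: str

@dataclass
class DataElement:
    name: str
    value: float
    units: str
    binding: TerminologyBinding
    model_id: str        # which published data model this element conforms to
    model_version: str   # versioned, so changes over time have clear provenance
    recorded_by: str

bp = DataElement(
    name="systolic blood pressure",
    value=128.0,
    units="mm[Hg]",
    binding=TerminologyBinding(system="example-terminology", code="BP-SYS-001",
                               display="Systolic blood pressure"),
    model_id="example.blood_pressure.v1",
    model_version="1.0.2",
    recorded_by="clinic nurse, outpatient visit",
)
print(bp)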

All of this oscillates between the impossible and the very difficult, as my colleague Alan Rector notably described the domain of medical terminology. We grapple between the formalized and the formalizable, with only fragmentary or rudimentary formal methods to underpin our efforts. This is a principal reason why much medical communication has persisted in narrative, written and diagrammatically illustrated forms, and word of mouth. It is why medicine is sometimes said to be the most fruitful domain in which to position and grow exemplars of innovations in computational methods, and the hardest in which to bring them to fruition. The complexity of meanings conveyed in health care systems, and the understanding of the nature of health care that is assumed and embodied in the ways in which the systems are designed and constructed, are too great for capture within a mandated framework of policy–democratic or otherwise. It is an organic entity that is seeded, grows, emerges and evolves in theory and practice. Software production was once commonly described using the analogy of a waterfall, flowing from systems analysis and design to coding and product. But like the apparent upward flow of water in the Maurits Escher lithograph Waterfall (1961), this is an illusory process.58

Software cannot emerge along a series of waterfalls. Software standards, likewise, cannot emerge in declarative form, ahead of agile iterations and consensus process based on experience in use. Information engineering and information flow require a different approach. What we do at present is often, with some justification, called out as ‘imagineering’ more than engineering.

Consistency in these enterprises matters, but when we seek or impose it by taming the complexity of the wicked problem–narrowing the scope of enquiry or constraining the analysis of data collected–the meaning and relevance of the results is inevitably diminished. It is a messy domain that defies narrow consistency. As Ralph Waldo Emerson (1803–82) famously wrote in Self-Reliance (1841), ‘A foolish consistency is the hobgoblin of little minds, adored by little statesmen and philosophers and divines. With consistency a great soul has simply nothing to do’. And as Erwin Schrödinger (1887–1961) translates, in quoting a remark of a Spanish colleague, Miguel de Unamuno, ‘If a man never contradicts himself, the reason must be that he virtually never says anything at all’.59 There is also risk in moving too far in the other direction: by adopting a looser but more true-to-life scope for a project, it becomes progressively harder to fund and pursue, as it will likely not be considered ‘appropriate’ as a manageable goal or subject of enquiry.

In tackling wicked problems, the rationale of what was done, how and why, and how it turned out in time, is all important to document and learn from. It is lost all too quickly. Seeking to adapt from an unsatisfactory and unsustainable status quo, policy makers tend to commission backward-looking and self-justifying reports and proposals. These largely comprise lofty rationales of times past, airy pontifications about times future and hubristic policies and proposals for time now. They set in train successive, likely equally unsuccessful, eras, with new heroic figures to take them forward, through a new iteration of high-budget megaprojects, costly reorganizations and general resulting mayhem. This costs too much and is, too often, largely ephemeral.

And resulting code and databases too often appear like this:

A diagram demonstrating how health care information systems have become opaque and entangled, and why a new discipline is needed to unravel them. The diagram appears as cogs–named ‘Clinical, scientific and population aspects’ and ‘Technical aspects’–with bands emerging from them and collecting in a tangled mess at the centre of the diagram. The diagram states ‘This entanglement costs hugely, in many ways’, listing examples of what it impacts: confidentiality, safety, efficiency, flexibility, effectiveness.

Fig. 7.9 An image of scrambled and non-coherent clinical data. Image created by David Ingram (2010), CC BY-NC.

Public policy makers who control purse strings have proven ill-equipped to manage this anarchic domain. There is hubris, and a lack of knowledge about the nature and complexity of computational methods, the precision and rigidity of computer systems, and the underlying information architecture of clinical practice. They have been all too ready to place trust in commercial suitors eager for government money for their businesses–suitors offering magical thinking to persuade funders that magical outcomes, of interest to these policy makers, will be delivered, but, sadly, too often not well placed to deliver value for money.

A lurid example from another field illustrates this reality. The planned UK West Coast main line railway modernization was at one stage reduced to a mandated budget by adopting an engineering solution that existed only in the minds of the bidding main contractor. This proposed to remove the very costly element of wired signalling circuits by adopting a non-existent, yet to be designed and implemented, wireless-based approach.60 The model of region-wide, single consortium, standardized hospital information systems, on which the National Programme for IT in the NHS was based, proved a similarly costly triumph of hope over experience.61 The Covid-19 Test and Trace service has been a more recent example, costing many billions, involving an interplay of technical, organizational and clinical factors, and demonstrating a limited capability to design and implement ambitious, unproven systems.

Overly ambitious, hubristic, ill-informed or just unlucky, policy makers become emperors with scant clothes, left throwing the dice of infrastructure decisions, and watching them roll to lucky or unlucky outcomes over time, perceived in retrospect as wise or foolish. Sometimes, small unforeseen and overlooked weaknesses magnify into dramatic emergencies, as with the Challenger Space Shuttle Disaster (1986).62 Sometimes, situations change so rapidly that the original objective and commitment becomes outdated, in timescale, resource and technology, before the envisaged infrastructure has come into use. The time constants of science, engineering, business and politics are often considerably out of sync.

The resulting confusion can lead to stasis or deadly embrace, with separate initiatives moving in different directions, combating one another, and producing a vector sum of outcomes near zero. Spending on infrastructure is an easy tap to turn off in times of financial imbalance, and a convenient tap to turn on when financial management prudence takes second place to the need to spend, in mitigation of economic decline and loss of employment. One often observes that, in the abstract reality of money supply, spending is as much a matter of who wants and has the power to spend or not spend, as it is about availability of ready cash, or, in times of old, gold in the ruler’s bank vault! As Paul Krugman, the American economist, now declares, debt is not what it used to be–a bit like nostalgia!

Subsumed within the health care field is an abundance of examples of the wicked problems described by Rittel and Webber, and of the multidisciplinary and multiprofessional challenges encountered in tackling them. These are real dilemmas and I do not wish or intend to discount or disparage the pressures in play. In the next chapter, I describe the work of five pioneers I have known and worked with, who tackled this reality, head on, in widely different health care contexts, and with considerable success and acclaim. What was special about them? In Chapter Five, I made an analogy with the character of the innovators and innovations that drove steam power and powered industry in England of the eighteenth century. They had head, heart, hand and skin in the game. And they saw and grasped their opportunities. We need to encourage such people and better enable them to engage more realistically with wicked problems.

The balance of life between work and leisure is changing. And in today’s world, scope and motivation for innovation reside in the powerhouses of industry, commerce and the voluntary sector, as much as in universities and the public sector. The importance of volunteers has been shown in a new light during the pandemic. A great deal of what will make a difference in achieving a better balance of health care for the future will reside in framing and responding to need much more flexibly, and in recognizing and harnessing those sectors and the human motivations that drive them.

The banner and battle of Creative Commons are crucial in this regard. Traditional commercial processes have often proved too costly and time consuming, and new web-based infrastructure is proving qualitatively and quantitatively more efficient and effective for creating systems that are both agile and scalable. This theme is developed in Chapter Nine. It is, for me, a lodestone in the quest for a care information utility fit for these divided and divisive times. Lodestones are natural magnets; aligned one way they attract, and, aligned otherwise, they repel. So it is with Creative Commons; we need to understand the polarities and forces in play.

Information Policy for Health Care

Whosoever, in writing a modern history, shall follow truth too near the heels, it may happily strike out his teeth.63

The story now enters more sensitive and febrile territory, where politics, policy, and money reign–hence the quotation from Walter Raleigh (1552–1618).64 Some call government ‘kitchen politics’. A UK Prime Minister of the 1960s, Harold Wilson, advised people to stay clear of the kitchen if they could not stand the heat. Raleigh’s quotation captures the ambivalent feelings about politics of many who are attracted to statecraft, while others appear relatively unaffected by kitchen heat. It is best not to set frying pans on fire, though, and, when they are on fire, to clear the fat and put out the flames. Information and information systems have moved centre stage in the policy and politics of health care. A lot of money has flowed, and a lot of anarchy has reigned. Many initiatives flamed, fizzled and burnt out.65

A presentation slide entitled ‘Transition–information technology can facilitate a more patient-centred health care’. The slide shows inverted triangles depicting the transition from Industrial Age to Information Age medicine. In Industrial Age medicine, professional care (tertiary, secondary and primary) is encouraged but costly. In Information Age health care, individual self-care, friends and family, and self-help networks are encouraged and less costly, whereas professionals as facilitators, providers and authorities are discouraged and costly (after Jennings, Miller and Materna, ‘Changing health care’. Santa Monica: Knowledge Exchange, 1997).

Fig. 7.10 The inverted triangles depicting the transition from Industrial Age to Information Age medicine. The original image was used by Richard Smith in his 1997 BMJ Editorial discussed in this chapter. The version here was created by David Ingram (2010), CC BY-NC.

Richard Smith was a pioneering editor of the British Medical Journal (BMJ) in the 1980s and 1990s, and the first to focus on information as a key policy issue for medicine. In a notable editorial around the time of the NHS fiftieth anniversary, he set the scene for the transition of health care delivery in the Information Age.66 It is a complicated story with multiple interests and perspectives in play, each tending to require others to bite on bullets. The imagery of the inverted triangles (Figure 7.10), a version of which Smith used in his editorial, has echoes of Paul Tillich’s book, The Shaking of the Foundations, describing the upheaval in religious doctrine over the centuries since the Reformation;67 and the phrase ‘world turned upside down’, describing the era of Oliver Cromwell and transition from the divine right of kings to parliamentary government in the seventeenth century.

Regarding the biting of bullets, I recall the words of the memorable media celebrity Anthony Clare (1942–2007), who came to Bart’s as Professor of Psychiatry in the 1980s, in a discussion about service organization and budget with the august College Committee at Bart’s. He described the problems his department faced and made the case for more money. A senior and influential medic, defending the fiscal status quo, countered, saying soft but firm words to the effect that it is tough, but there is no money, and we must all ‘bite the bullet’! To which, in his mellifluous Dublin tones, Clare responded: ‘I recognize, in the clarion call of Dr …, that we should bite the bullet, the voice of one whose teeth will not be doing the biting’. Done so well that all sides of the Medical Committee broke into laughter, including said influential medic! It requires a particular strength of mind and timbre of tongue to speak truth to power like that, survive and stay on good terms–not an easy feat or common phenomenon!

Public sector policy makers who control purse strings have proven ill-equipped to manage the information policy domain and are left firing bullets to be bitten, in multiple directions. They have harboured often quite naive misconceptions about the nature and complexity of computational methods and the quality and robustness of computer systems. They have been too ready to place trust in commercial suitors eager for government money. Information services for health care were wrongly thought of, from the beginning, as a technical and routine commodity. They were seen primarily in terms of management and communication, not as an information utility and integrative ecosystem of wide reach: representing and communicating meaning and purpose among all concerned; facilitating their reasoning and enabling and justifying their choices, decisions and actions; capturing and recording data and workflow in all contexts of health care; supporting education and research; and managing personnel, materials, facilities, queues and money.

This change to an integrative perception of the information system as a utility is as profound as William Harvey’s (1578–1657) De Motu Cordis (1628), in which he postulated the circulation of blood in the body. It is as ground-breaking as Sherrington’s conceptualization of an integrative nervous system in the body, the foundation of the neuroscience of today. What we are concerned with, here, is, by analogy, the integrative information utility of the body and mind of health care systems and services.

As previous chapters have demonstrated, established thinking has often been biased and protective in its metrics and judgement of innovation, perhaps especially so in areas of wicked problems, where stakes are high.

For example:

New systems are cumbersome to install and make use of. This is nothing new. The Times wrote in 1834 that it was unlikely that the medical profession would ever start to use the stethoscope, ‘because its beneficial use requires much time and gives rise to a fair amount of difficulties’.68

This is understandable as an expression of human limitation in predicting the future. The information policy response needs to be refocused away from controlling and regulating today, towards how to create a desired and sought-for tomorrow. The received wisdom about failings in health care, when observed from high up, has been of poor or deficient management. This has been addressed by adding management. No one has pieced together the timeline of actions, monitored the changes, and remembered and learned from the outcomes. In politics, five years is the event horizon, although a week is also sometimes a long time. In science and engineering, implementation horizons can stretch three times that far ahead. In business and everyday life, the eyes are focused on the days, weeks and months ahead. Failed corporate memory and incommensurate timescales of ambition contribute to the wicked nature of the policy challenge, where problems can neither be fully understood nor fully owned.

Whitehead’s term ‘anarchy of transition’ headlined this second part of the book, and the transition, here, is the information revolution of our times. The information pandemic is a chaos, imposing an additional burden on services and sucking away resources that they need to cope. Local IT services operate, in Francesca Wilson’s (1888–1981) phrase, ‘in the margins of chaos’.69 They are the ones having to bite the bullet and cope, because life in the margins, generally, is hard! In her case–that of military conflict–it was especially so.

It is harder, though, to accept administrative chaos as being as inevitable as the chaos of war. In twentieth-century health care, it is not. It is the cloaking and denial of repeated failure. Few people–partly mindful of Raleigh’s caution, quoted above, and partly out of genuine bemusement and confusion–choose to speak out, dare to speak out or can speak out. And those who do speak often do so loudly, making assumptions and diagnosing the issues to serve narrow personal, professional and commercial leanings and interests. Political discourse at the top of health care services, and in its ramifications into education, law, commerce, finance and governance, has become a fierce and controversial domain of local, national and international rivalries and interests, and of contest for resources. Much of this now plays out in policies and markets for information systems and infrastructure.

Chaotic and unstable discourse, playing out in limit cycles of policy change, has resulted, time and again, in implementation failure, patched over with bureaucratic and political justification, obfuscation and amnesia. For an organization that prides itself on learning, the almost non-existent organizational memory of our NHS on information policy and implementation through these times, or even recognition of that lack, is woeful. People prefer talking to bullet points rather than biting bullets. They wring their hands, and cash registers ring. And still the ‘imagineering’ and throwing of billion-dollar dice persist and charge ahead. My songline has travelled through many of the connected worlds I have cited here, and some may think I have misrepresented and traduced them. All of this is said and written without rancour and with good will. Personally, and thankfully, I have come out the other side, but many good people have not been as fortunate. Having expressed these feelings of dismay here, in Part Three I will stop looking back and looking on, and start looking forward to what can and needs to be done differently in the future.

At this point, I will trawl through public policy and reports linking health care and information technology over the past fifty years, and then revisit Illich’s perspective of fifty years ago, to draw conclusions about where health care services now are, and the direction of travel that future information systems and services should chart, in support of their reinvention and enablement.

Connecting Policy with Practice–A Fifty-year Timeline

If there were to be an informative poll of the costliest and most enduringly significant failures of public policy, internationally, of all time, my guess is that failure in health care information policy would rank quite highly. It might even prove an outlier. In recent years, Barack Obama was persuaded to invest heavily in improving standards of electronic medical records in the USA. He described the limited progress achieved as the greatest disappointment of the health reforms prioritized in his term of office as US President.70 The UK Parliament’s Public Accounts Committee has regularly called in the NHS hierarchy for a drubbing about IT, whenever it felt in need of some action against this all too easy target. It is regrettable that its spotlight is not also turned on the hubris and pretence exhibited in the political drivers of the anarchy.

Policy makers have sought to defend themselves by distancing themselves from the arena. One tactic has been to relegate information policy to a lower level. In the UK of the 1970s, it was a matter for the supplies division of the Department of Health and Social Security (DHSS). Finding the issue coming back to the top of the pile, another tactic has been to delegate or subordinate it–making it primarily a technical matter or a local responsibility, or a matter to be resolved within commercial markets. Repeating the exercise under new management and leadership, a third approach has been to call on independent review and advice, within or outside government, from a mix of disciplines and professions.

Some such consultants and reporters have their personal axes to grind or wear blinkers. They put a periscope above the water and peer ahead, beyond the waves, to a new order–a brave new world of science and society, health and care. This is usually a twenty-year perspective, clothed in the buzzwords of the day–an intoxicating mix of opportunities and the changes implied in getting there for patients, professionals and managers. A parcel of recommendations is tied with ribbons of new information-led institutions and services, to be relied on to square circles, ensuring that expectations are met, quality assured and efficiency achieved. Beyond the blue skies, sustained ownership and leadership of planning and implementation have been severely challenged and found wanting, in all phases of the implementation and review of such visions. 2020 vision should be better than this.

As Rittel and Webber’s characterization of the wicked problems of social policy has illuminated, health care information policy is a nightmare of changing and evolving, ungripped risks and complexities. Its implications extend throughout health care and in all their scientific, technical, social and economic contexts. A twenty-year timeline of radical change cannot be managed over an electoral cycle, within a domain and marketplace that can spin on a (Covid!) coin in three months. We are living through multiple spins of coins–politics and choices are 50:50 toss-ups and black swans are flying in from all directions.

A strategy for navigating this domain might best be seen through the guiding tenets of Sun Tzu’s two-thousand-five-hundred-year-old Art of War! In his almost poetic encapsulation of the nature and imperatives of battle, lie human objectives and insights, set out within the context of plans, strategies, methods, energies and the leadership, choice and adaptation of decision and action, through the unfolding engagement. I do this for fun and light relief, in Chapter Nine!

There are battles over information all around the circle of knowledge, and throughout the communities of health care practice–among academics, professionals, industrialists and citizens. Rational policy becomes intractable in the changing landscape of science and society, and of industries and professions. Alternating bullish hubris and air-brushed failure have been disabling at the coalface of health care services. But this anarchic situation will play out, one way or another, and good new approaches can take root and emerge from the confusion.

In thinking about how to describe the policy position we find ourselves in, one way to start is by observing how the landscape and its challenges have been framed, historically, in policy documents and legislation. One way of recognizing, gauging and learning from failure is to search back into archives of past policy initiatives, to see what was achieved and learned. In policy for medical information systems, there is a fifty-year track that I will follow here. I have lived through it in both personal and professional contexts. Stacked around me as I write are what seemed the key reports of their times, and the more recent ones are a few clicks away online.71

The following review tracks these documents and their contexts of health care IT since the 1970s, when such documents first came on the scene. A striking observation is that they have changed remarkably little over the intervening years. The descriptions of current realities, and of what should be aimed for, when adjusted and normalized for the language and context of the day, are recognizably the same. The implications and costs of failure are today more acutely characterized and impactful. They have evolved into a central crisis of the health care services of the age. In the additional resources for the book, Appendix II,72 I have catalogued the key legislation, policy and organizational changes in the NHS since 1948, and this has helped to provide contextual orientation and to trigger my memory.

1967–The Flow of Medical Information in Hospitals–Nuffield Provincial Hospitals Trust

This short and succinct report described the flow of information within and between all departments of an acute hospital.73 It is the earliest such publication in my archive, from the time I entered the field and started collecting them. The Nuffield Provincial Hospitals Trust, subsequently the Nuffield Trust, has taken a strong philanthropic interest in health care since its earliest days. William Richard Morris (1877–1963), the first Viscount Nuffield, was the English Henry Ford of his era, as a pioneer of the UK motor car industry.

The report took the form of flow charts, which were seen as an essential first step in planning for automatic data processing in any part of the hospital’s information system, ‘so as to record the precise content of each item of communication and the responsibility for its origin’.74 The study was conducted in partnership with the English Electric Leo Marconi Group, prominent in early computer hardware design and manufacture, which was spreading its wings at the time into industrial automation and business data processing. It made a strong case for systematic study of information in its real-world context, as a basis for improving existing procedures or introducing new techniques.

1977–Policy Implications of Medical Information Systems–Congressional Technology Assessment Board of the US Senate

On 28 October 1977, Senator Edward Kennedy wrote to the Committee on Human Resources of the US Senate, submitting a report from the Office of Technology Assessment. The commissioned report was entitled ‘Policy Implications of Medical Information Systems’.75 The Office of Technology Assessment (OTA) was created in 1972 as an advisory arm of Congress, to provide wide-ranging reviews of policy issues. Its brief was well framed:

[…] OTA’s basic function is to help legislative policymakers anticipate and plan for the consequences of technological changes and to examine the many ways, expected and unexpected, in which technology affects people’s lives. The assessment of technology calls for exploration of the physical, biological, economic, social, and political impacts which can result from applications of scientific knowledge. OTA provides Congress with independent and timely information about the potential effects–both beneficial and harmful–of technological applications.76

This was the first governmental strategic review of the field that I encountered. I have emboldened the key issues it highlighted, which remain central concerns to this day, approaching fifty years on:

  1. The benefits and limitations of medical information systems;
  2. The factors influencing their adoption; and
  3. Policy alternatives for the Federal Government with regard to such systems.77

The team assembled to conduct the study came from across health care, academia and industry. Octo Barnett (1930–2020), the luminary director of the Laboratory of Computer Science at Massachusetts General Hospital and Harvard Medical School, was a notable member–alphabetically and in terms of his eminence, first among equals on the list. His name must feature in the top level of those whose practical and intellectual insights and endeavours defined and shaped the field over the coming decades.

A medical information system was defined thus:

A medical information system is defined as a computer-based system that receives data normally recorded about patients, creates and maintains from these data a computerized medical record for every patient, and makes the data available for the following uses: patient care, administrative and business management, monitoring and evaluating medical care services, epidemiological and clinical research, and planning of medical care resources.78
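Read as a piece of software architecture, the definition amounts to a single patient-centred store of data re-used for each of the purposes the report lists. A minimal sketch of that reading, as a Python interface, follows; the class and method names are my own illustrative assumptions, not the report’s nor any real system’s API.

  from abc import ABC, abstractmethod

  class MedicalInformationSystem(ABC):
      """One computerized record per patient, serving the uses listed in the 1977 definition."""

      @abstractmethod
      def record(self, patient_id: str, item: dict) -> None:
          """Receive data normally recorded about a patient and add it to that patient's record."""

      @abstractmethod
      def patient_record(self, patient_id: str) -> list[dict]:
          """Make the data available for patient care."""

      @abstractmethod
      def administrative_report(self, period: str) -> dict:
          """Support administrative and business management."""

      @abstractmethod
      def quality_review(self, service: str) -> dict:
          """Monitor and evaluate medical care services."""

      @abstractmethod
      def research_extract(self, criteria: dict) -> list[dict]:
          """Provide data for epidemiological and clinical research and planning of resources."""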

Acknowledging the fledgling status of exemplars from the field and the pace of evolution of systems and technologies, which made assessment of benefits and limitations difficult, the report cautioned that without a federal policy towards these systems,

[…] their diffusion may well proceed indiscriminately, and standardization will not be possible. If so, the full potential of medical information systems is not likely to be achieved.79

The eighty-page report was commendably informed, succinct and wide-ranging. Its findings regarding benefits and limitations covered the following: institutional delivery of care, support of clinical decision making and physician education, assessment of the quality and utilization of medical care services, malpractice litigation, roles of medical care professionals, health data systems (by which was meant a clinical data repository for study of population health and health services), planning and research, and confidentiality of patient records. Factors influencing adoption covered: acceptability to medical care providers, technical transferability, cost and wider contexts of technology development and federal policy and incentivization.

Regarding technical transferability, the report noted:

Prototype medical information systems have been proven technically feasible, but most have not yet been made adaptable to the various conditions of different institutions. In order to realize the benefits of a standardized database and to market systems economically on a large scale, flexible systems are required.80

The study was focused on direct patient care, saying:

The capability to accumulate and retrieve data for each patient is critical for both the process of patient care and research.81

Attention was thus paid to individual patients’ medical records, while acknowledging that:

[An important capability is to provide necessary data for] administrative and business offices.82

Successful federal support, from the mid-1960s, for the development and adoption of systems addressing business needs, was noted.83

The report also noted that early attempts in the 1960s to install integrated information systems in hospitals had proved costly failures. The common themes characterizing accounts of those failures were seen to be:

Inadequate understanding of the complexity and variations in medical care, inadequate computer hardware and software, and inadequate commitment of capital for long term development.84

The report described three prominent prototypical systems to illustrate potential implications for patient care and the whole health system. These were: the Technicon Medical Information System at El Camino Hospital in Mountain View, California; the COSTAR system developed at Barnett’s Laboratory of Computer Science at Massachusetts General Hospital, in use by the Harvard Community Health Plan; and the PROMIS system (Problem-Oriented Medical Records Information System, based on the original ideas of Larry Weed (1923–2017) for structuring medical records according to problems identified and tackled), in use at the University of Vermont Medical Centre. No attempt was made to survey the field and categorize systems by design and capacity. Attention was restricted to integrated systems rather than those dedicated to particular specialties or departments.

In their review, the team focused on:

  • Capture of data;
  • Provision to providers of care and administrative and business offices;
  • Administrative and communication functions: messaging among departments, scheduling of appointments, charging and billing;
  • Provision of database for investigators: quality of care assessment, clinical decision making, epidemiology, health services research and planning and evaluation of care.

They noted that, although by then technically possible, no system yet incorporated all four functions. They highlighted how the speed of access online and the ability to manipulate and analyze data require careful structuring and definition of the database, as well as aggregation of ‘massive amounts of data on large populations for long periods of time’.85

In a section considering the variability of medical care, the report highlighted the variability of style, format and language of records–between institutions, and between clinicians within institutions. The importance and additional complexity of handling free text was noted:

At present, lack of standardized nomenclature or established protocols in medical care continues to constrain the development of a generalised database.86

Another problem seen to be significant was that:

Because medical information systems have been developed through the independent efforts of many investigators, today’s systems reflect diversity of philosophies and technical approaches.87

The section on policy alternatives adduces a compelling rationale:

The federal government could continue current policies and allow adoption of medical information systems to be determined in the open marketplace. However, this policy could result in medical information systems being marketed and adopted without additional investment in research to improve certain capabilities. Because capabilities to improve and monitor the quality of medical care and to facilitate research and planning are the least developed and require standardization, these potential benefits for patients and the medical care system might be lost. Computer systems limited to administrative and financial functions could continue to dominate the market. Medical information systems that might be used could also lack high standards of quality or provide inadequate protection for the confidentiality of patient data.88

In arguing for Federal action influencing development, standardization and eventual use of medical information systems, a range of policy options was proposed:

  • Central clearinghouse to coordinate developmental projects and provide public information;
  • Funding for cost-benefit evaluation;
  • Contracts for design and development of systems with specified capabilities;
  • Incentives for adoption of systems that improve quality of care and support research and planning;
  • Central organization to develop, validate and maintain the knowledge content of medical information systems;
  • Standardized databases, to include nomenclature, terms, definitions, classifications and codes for use in systems;
  • Guidelines for precise standards to protect the confidentiality of patient data.

I’ve quoted this report at length because its early policy pointers, as highlighted here, have remained relevant and impactful to the present day. Failure to understand and take due notice, at all levels and in all sectors of health care services, has cost society hugely, in waste, burden and lost opportunity. If Barnett’s successors at Harvard Partners, led by Blackford Middleton, were anywhere near the mark in their assessment of the economic cost of this failure, it has amounted to a direct accumulating cost of eighty billion dollars per annum for the USA alone.89 That was a midpoint estimate, between 1977 and around the year 2000. Let us be extremely conservative and say forty years at that level, in present day money–not adjusting for inflation, that is. This is an eye-watering cumulative total of several trillion dollars–a staggering hit on just one economy. Slow adopters have been quite fortunate! It is hard not to suspect some meaningful correlation between this underlying cost and the outlying high costs of health care in America, as a proportion of GDP, connected with its commercial computerization.
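The back-of-envelope arithmetic behind that ‘several trillion dollars’, taking the estimate of eighty billion dollars per annum at face value and making no adjustment for inflation, is simply:

  \[ \$80\ \text{billion per year} \times 40\ \text{years} = \$3.2\ \text{trillion} \]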

1982–The Körner Report–UK Department of Health

Edith Körner came alone to England as a schoolgirl, in 1939. The relatives she left behind died in wartime extermination camps. She learned English and earned a living monitoring wartime intelligence, using her fluency in Russian, German, Italian and French, while studying economics at the London School of Economics. She was, by all accounts, a formidable person in public life in Bristol and the Southwest of England. I’ve known people like her.

In 1980, she was asked to chair a review of the health service information required to manage a district of two hundred and fifty thousand people. The Körner committee worked for four years and produced six reports. This was the first review of how the NHS collected and used data, and it set the scene for the coming decade of NHS Information Strategy. The scope was comprehensive, with recommendations for changes in the information collected about hospital clinical activities and their patients, community health services, paramedical services and patient transport services, and in information about manpower and finance.

The goal they adopted was to devise a series of ‘minimum datasets’, providing the basic statistics that every health service authority should have in order to manage its health services effectively, to be collected economically, quickly and accurately.

The Terms of Reference were:

(1) To agree, implement, and keep under review principles and procedures to guide the future development of health services information systems; (2) to identify and resolve health services information issues requiring a coordinated approach; (3) to review existing health services information systems; and (4) to consider proposals for changes to, or developments in, health services information systems arising elsewhere and, if acceptable, to assess priorities for their development and implementation.

Many of the recommendations concerned data about individual patients but patients’ names were not included in the datasets, with the argument made that management use required the data in aggregated form. Improvement of methods for collecting, processing and analyzing the data, at local levels of care, was a major concern of the times.
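A minimal sketch, in Python, of what a Körner-style minimum dataset amounts to in practice is given below: an agreed list of items every authority must be able to return, aggregated for management use rather than patient-identifiable. The field names are my own illustrative assumptions, not the committee’s actual definitions.

  # A hypothetical minimum dataset for inpatient episodes: agreed items only,
  # with no patient names carried into the returns used by management.
  HYPOTHETICAL_INPATIENT_MINIMUM_DATASET = [
      "district_code",
      "specialty",
      "admission_method",        # e.g. elective or emergency
      "length_of_stay_days",
      "discharge_destination",
  ]

  def aggregate_by_specialty(episodes: list[dict]) -> dict[str, int]:
      """Return episode counts by specialty: the aggregated form in which
      management received the data."""
      counts: dict[str, int] = {}
      for episode in episodes:
          counts[episode["specialty"]] = counts.get(episode["specialty"], 0) + 1
      return counts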

1986–The National Strategic Framework for Information Management in the Hospital and Community Health Services–UK NHS

The scale and cost of the NHS had risen continuously over the previous four decades. Health and Social Security policy, legislation and resources were focused centrally within a London-based ministry–the Department of Health and Social Security (DHSS). Operational management was overseen and directed from the centre but devolved within five regions of the country, which were, in turn, subdivided into areas and districts. This was a massive brief and management challenge, awareness of its detailed context having been heightened through Körner’s intelligent, determined and trusted eyes.

A central NHS Management Board was established, reporting to a Health Services Supervisory Board operating within the Whitehall ministry. In subsequent decades, this was recast as the NHS Executive, and a number of separate and autonomous health-related agencies, each with a different coordinating focus, were established by statute–NHS Improvement, the Care Quality Commission (CQC) and Monitor, among others. Responsibility for the education, training and regulation of the health professions resided with the separate General Medical Council and professional bodies, such as the Royal Colleges. These, in turn, had umbrella organizations for coordinating policy and practice. Other government agencies with generic responsibilities for oversight roles, such as the Audit Commission, retained their interest and power to act in investigating the health service.

It was a struggle to keep any head steady around the complexity that was unleashed. The power to set policy, legislate and control money was the only stabilizer of this unwieldy ship of NHS state, a super complicated supertanker. The idea that IT would help solve anything at that level was ambitious, but maybe just seen as bold and decisive! There were brave souls who saw this as their mission and chance, and first to the top of the tree was Mike Fairey, a hospital manager at the London Hospital, in Whitechapel, whose pioneering hospital patient administration system project had set the scene for what was to come in the wider NHS, as introduced earlier in this chapter.

He became Director of Planning and Information Technology, and set about the task of creating a National Strategic Framework to ‘make sure that people control IT rather than the other way around […] to collect data wisely and to apply information skilfully’.90 In four pages, seven annexes over a further ten pages, and a one-page action plan, the management vision was set out as follows:

Key issues:

  • Integration of information management within health care as a ‘business’;
  • Developing better systems;
  • Being more efficient;
  • Making health care more effective.

Framework:

  • Central policy and control constraining local implementation strategies, supported, and enacted through common technical standards and management information requirements.

The Annexes are a blur of clipped management speak and generality:

  • Annex One–the range of information;
  • Annex Two–deriving information requirements from service plans;
  • Annex Three–delivering information systems;
  • Annex Four–managing the key resources;
  • Annex Five–research, development and applications;
  • Annex Six–the use and supply of information;
  • Annex Seven–information management at the centre.

The document concludes with a one-page summary and timetable of seventeen actions for implementation, delegated to the five branches of a newly created Information Management Group, populated by NHS and Department of Health appointed staff. The politics of the era feature heavily in these structures. The separation of family practitioner services (FPS) and hospital and community health services (HCHS) within the existing DHSS structures led to some juggling of who controlled and did what.

The policy-related responsibilities and roles of the DHSS required that control and monitoring of NHS performance indicators, statistics and research should remain with the DHSS in Whitehall. Responsibility for what was now being identified and defined as corporate data management and, more specifically, custodianship of the Körner data definitions and creation of an NHS Data Model, came under a new grouping called NHS Corporate Data Administration. Development and implementation of common technical standards and systems throughout the NHS came under a new grouping called NHS Centre for Information Technology. Both these latter new groupings were established at a new home in Birmingham. Primary Care strategy and implementation remained in Whitehall, associated with the FPS, and pursued as a largely separate agenda.

The now acutely problematic separation of social care policy, practice and management from NHS services was reflected in this National Strategic Framework, from the start. I get a headache, still, today, when reading it. The authors must have been quite convinced by it, as the timescales set for implementation were precipitate. Fifteen of the actions would lead to the final product within two years. Action 2, on common data standards, was described as ongoing, and Action 7, to produce a Common Basic Specification for NHS IT systems development, was given three years. Given what subsequently transpired, the similarly set, and similarly defaulted on, timetables for implementation of the Information for Health policy, ten years later, and the NHS National Programme for IT, fifteen years later, are sobering.

Such documents are intoxicating to the heads that commission and write them and head-aching to those at the sharp end of what ensues in implementation and practice. The tone is magisterial and coolly declarative. Much about process management and cost, nothing much about content, leadership, outcome and value. They come across as written by ascending stars, struck by the magnificence and authority of what they were about to make happen in the world. These mostly burnt out as shooting stars, in the atmospheric friction of everyday health care delivery realities. Seen from the top down it must have come as a relief to ministers that this burgeoning problem was under such firm and decisive control. It was a Herculean vision and Hercules quickly looked around and passed the stone to Sisyphus.

The Roman poet Horace (65–8 BCE) put it quite well in his Ars Poetica (l. 138): Parturient montes, nascetur ridiculus mus [the mountains will go into labour, and a tiny little mouse will be born].

1988–The Common Basic Specification–UK NHS

And so was born the Common Basic Specification. In 1984, the NHS had made a fateful decision–to build a common data model that would be mandated for use by all its Health Authorities across the country. A first version was published in 1986–87 and was not received with any enthusiasm by those charged with maintaining management data in local health communities. Data modelling was already reasonably well-established practice in the industry, but the groundwork required for defining and validating a common and generic standard that would be useful in practice, however attractive as a concept, was lacking.

But the mood of the times was top-down control, and the failure to connect with data at a local level was attributed to incompleteness of the data model, rather than to deficiency in the plan itself. It was resolved to develop the model further by incorporating it within a generic model of health care processes, to which all local systems would be required to map. This was decided by the NHS Management Board in 1988 and made a main plank of the mission of the new Information Management Group.
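What ‘mapping local systems to a generic model’ implies in practice can be sketched as below, in Python; the generic field names and the local ones are illustrative assumptions. The point the sketch makes is that every local system acquires a mapping table to declare and maintain, and fails whenever the generic model and local practice drift apart.

  # Hypothetical generic model fields and one locality's mapping onto them.
  GENERIC_MODEL_FIELDS = {"subject", "activity", "location", "start_time"}

  LOCAL_TO_GENERIC = {
      "pat_no":    "subject",
      "proc_code": "activity",
      "ward":      "location",
      "admit_dt":  "start_time",
  }

  def map_to_generic(local_record: dict) -> dict:
      """Translate one local record into the generic model, failing loudly
      when a local field has no declared mapping."""
      generic = {}
      for local_name, value in local_record.items():
          if local_name not in LOCAL_TO_GENERIC:
              raise KeyError(f"no mapping declared for local field '{local_name}'")
          generic[LOCAL_TO_GENERIC[local_name]] = value
      return generic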

Substantial resources and protected spaces were given to this group over subsequent years. Its main mission was seen to be the creation of a process model and the mandating of it into practice, rather than working iteratively, with feedback from practical implementations. It published hundreds of pages of detail, with the generic model mapped to different subsystems of health care management information. There was completely inadequate ongoing connection with health care delivery, to ground the work and establish whether the idea was feasible in practice and useful in solving the local problems faced in managing information.

To quote:

[The CBS is]

I was asked to be a member of a team established to review the work of hospital-based projects funded to implement the CBS. As far as I could observe and elicit, there was no discernible link between the code they produced and the CBS models. It proved a task beyond them. Oracle received funds to implement it within their database technology, which I imagine they gratefully accepted, but they, too, drew a blank in connecting this work with practical health care systems.

There are some good bits in the CBS documents–the cartoons were quite amusing, but the jokes turned out to be on the CBS rather than on the clunky health care systems they depicted: NHS as a chariot with triangular wheels, CBS as the brain of the NHS, CBS as a pulley manipulating users into systems, CBS as a racing horse and NHS as a sickly camel. A doctor standing in the 1980s and peering through a telescope into the 1990s. Talk about ‘Imagineering’!

There followed many such attempts to shoehorn NHS operational data into manageable groupings, with varying success at the coalface. Groupings were sought as idealized simplifications of real life that could be used to compare the scale and level of service being delivered in different settings, such that resources could be allocated and managed on the basis of this ‘Casemix’, and performance assessed. Coding and classification of episodes of care, and diagnostic and health-related groupings, became focal issues, and this area of work became embedded in routine reporting and central aggregation of data for the NHS.

The untidy nature of patient care, across different sectors and institutions, militated against tidy and useful data definitions. Those adopted became instruments of managerial mandate. The Finished Consultant Episode (FCE), Healthcare Resource Group (HRG) and Diagnosis Related Group (DRG) became axioms of how health care worked and how its management data should be aggregated. This added considerable back-office burden, for unknown benefit to the health of the nation.
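A minimal sketch of what casemix grouping involves is given below, in Python. The grouping rule and codes are illustrative assumptions, far simpler than real HRG or DRG definitions, which is part of the point: a tidy grouping has to be imposed on untidy episodes of care before resources can be allocated and performance compared.

  def assign_group(episode: dict) -> str:
      """Assign a hypothetical resource group from a diagnosis code and length of stay."""
      diagnosis = episode["primary_diagnosis"]          # an ICD-style code, e.g. 'I21.9'
      long_stay = episode["length_of_stay_days"] > 5
      if diagnosis.startswith("I"):                     # circulatory disease chapter
          return "GRP-CARDIAC-LONG" if long_stay else "GRP-CARDIAC-SHORT"
      return "GRP-OTHER"

  episodes = [
      {"primary_diagnosis": "I21.9", "length_of_stay_days": 8},
      {"primary_diagnosis": "J18.1", "length_of_stay_days": 3},
  ]
  print([assign_group(e) for e in episodes])            # ['GRP-CARDIAC-LONG', 'GRP-OTHER']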

1991–The Health of the Nation–UK Department of Health

Years of preoccupation with management of the NHS were seen to have become detached from strategic focus on the NHS’s primary purpose, to maintain and improve the health of citizens. It was also seen to have added a burden to local service managers and clinicians, to the detriment of the services themselves. Thus was born the government’s magisterially titled publication, ‘The Health of the Nation’.92

The emphasis switched to public health and the balance of prevention, treatment and rehabilitation, following the differentiation employed by Beveridge in the early post-war years. It placed the lifestyle of the citizen centre stage and, likewise, the importance of education and information in guiding the choices citizens make. It kept faith with the importance of good management but refocused its mission towards setting objectives and targets for improvements in health. For example, on page 13, there was an obesity target! How these targets should be prioritized, set and monitored, to be effective, was a major part of the consultation set in motion.

Recognizing the wide range of influences in play, the report acknowledged the need for widely shared ‘ownership’ (the report’s quotation marks) of plans and implementations, at all levels. The need for ‘better ways of monitoring and assessing health and measuring the effectiveness of interventions and monitoring their achievement’ was emphasized.93 The need was recognized for the NHS to function better as a ‘head office’ and for the Department of Health to work with the NHS and bridge to other Government ministries with key roles in this wider strategic framework. All very top down, as ever.

1993–Tomorrow’s Doctors–UK General Medical Council

This document presented the findings of a wide-ranging national review of medical education.94 Medical education had long been divided into a basic science component and a clinical component, the latter organized through attachment to clinical teams in different specialities.

The undergraduate, postgraduate and continuing education and training of doctors, nurses and paramedical professionals has profound implications for NHS workforce planning. The accreditation, registration and regulation of doctors for professional practice lie with the General Medical Council and the Medical Royal Colleges. Different levels of registration follow undergraduate study and several years in professional employment, and medical schools have close links with the hospital Trusts to which graduating students move to complete these preregistration years of practice. Seeking to balance the supply of graduating students with the availability of supervised preregistration positions, and a flow from these into more senior positions within the NHS, the numbers of medical students, and thus of medical schools, are nationally overseen and mandated.

This influential review took a new look at the knowledge, skills and attitudes required of a practising doctor in the late twentieth century. The curriculum had become more and more densely populated with science and specialism, and a fresh approach was sought, to achieve better integration between the scientific and clinical practice components and the assessment of both knowledge and skills in the examination system. There had been several decades of increasing specialization in the treatment of disease, in which each new and expanding field was eager to have its contributions recognized in the formal curriculum, and, of course, receive the associated student fees. The changing balance of physician, surgeon and imaging and laboratory specialisms was a part of this evolution, and general practice was gaining status, as the health service increased its focus on the role of primary care. Costs were extremely high, reflecting the extended seven-year curriculum through to first GMC registration to practice.

Implicitly, wider issues of competence and accountability in clinical practice were surfacing, questioning the extent to which these were rightly the sole preserve of clinical professional organizations and their judgements. The curriculum was widening its coverage of ethics and law, but the GMC document mentions ‘information’ only in passing. The impact of information technology was perceived mainly in the context of clinical skill in managing the knowledge base of medicine and computer-based learning. This was worrying evidence of a general failure of insight about the central importance of coherent electronic care records for clinical method and practice. Regulatory bodies must play a more central role in their design and implementation, and adjust their focus to be cognisant of the crucial roles they can and must play in the future, in education and training and in the review of competent practice.

1994–Peering into 2010–A Survey of the Future of Medicine–The Economist

In March 1994, the widely read and influential weekly journal, The Economist, published ‘A Survey of the Future of Medicine: Peering into 2010’.95 The message was optimistic that ‘new technologies are set to transform medicine, eradicate most disease, and hugely improve people’s health’.96

I interpose it here as it lies roughly halfway in time between the US OTA Report of 1977 and the Eric Topol Review of 2019, which is the last of the UK government reports I review in this section. The image of the then current reality of the mid-1990s that it depicted was a close match to the image of the mid-1970s presented in the OTA Report. The image of the prospective reality, looking twenty and more years ahead, as seen through The Economist’s telescope of 1994, bears striking resemblance to that seen in the Topol telescope in 2019. It would be a bit depressing for astronomers if their images of the universe were as unchanging as that!

Taken together, the three reports are close in the way they highlight the key challenges posed by poor quality and coherence of care records and population-based information systems, and the potential to be realized with digital records. Each survey recognized, up front, that improvement in this area was a sine qua non of achieving the wider improvements and benefits they expected to see. They agreed, very largely, on why this was important and the difficulties it posed. None had anything substantial to offer by way of how it was to be achieved. Arguably, Barnett’s team was more clear-sighted–he was both clinician and engineer and a Harvard Professor in both domains.97

Running to twenty pages, The Economist report took apart the efficiency, effectiveness and professional domination of ever more costly contemporary medicine in the USA, piece by piece. One page is devoted to the perennially and pervasively ‘poor medical record’, giving a passing plug for the ‘ongoing work of Advanced Informatics in Medicine Initiative (AIM) in Europe’, where the GEHR project team had just published its first health record information architecture. In its place, it assembled an edifice of automated systems and customer-focused, managed care, integrated in an all-embracing information network, capturing data, both little and large.98

Prototypes surveyed ranged over image-guided interventions and robotic surgery operated through telepresence, working less invasively, more precisely and more safely, and over gene therapy. Cystic fibrosis and other single gene defects would, it expected, have been cured twenty years hence, by 2015, and longevity gene drugs would be licensed by 2020. Jumping further ahead, cancer, heart disease and other serious diseases would have been cured by 2040, and most serious disease by 2050.

The survey was also cautious to some degree, but optimistic: ‘There will be upheavals along the way; there may be resistance from medics or others with an interest in stopping change. But the concomitant health gains will be so great that such obstacles are bound to be overcome’.99

An industrial model of health care delivery pervaded the report, exuberantly and overpoweringly, as it does in many places still, today. Thus, we read that ‘If health care systems are to be made more efficient there must be some way of measuring their input (sick patients) and output (cured ones). To gather this sort of information, a patient’s welfare has to be tracked from medical records, data must be pooled and processed, and the outcome of any treatments must be monitored’.100 Touching on issues of privacy, it quotes a Stanford professor’s judgement that the ‘level of security provided by electronics is now ten times better than by hospital manual records today’.101 The recent experience of extensive and sustained hacking of US Federal Government data via the SolarWinds Orion software used to monitor networks counsels continued vigilance on that score.

On the downside, slow social adaptation to the pace of technological change and the potential for harmful genetic mishaps, through germline gene modifications, were cited. The report concluded, thoughtfully, as follows:

Putting concerns about privacy and the ethics of human genetic engineering aside, the biggest worry may be ‘humanity’s inescapable triumphalism’. This, says John Maddox, editor of Nature, is what accompanies a rush of discoveries that leave the impression that scientists know much more than they really do. New technologies are adopted with wild enthusiasm, even when they need a lot of further work. This time, though, science is being cautious. New regulatory bodies have been set up to oversee genetic engineering. New medical products cannot come to market without undergoing rigorous testing–though there may be a case for broadening the tests’ criteria […] Although many new technologies raised tricky medical, ethical, and social problems, they can be managed with legislation and with the right regulatory constraints […] Given this, it is hard to see why anyone should reject the opportunities that new medical technologies are likely to offer. The reward, after all, could be a guaranteed hale and hearty future for all.102

One looks forward to The Economist’s retrospective view of its telescopic predictions from twenty-five years ago!

1995–Setting the Records Straight–A Study of Hospital Medical Records–UK Audit Commission

The NHS, at a senior level, was becoming more aware of the central role played by medical records, in relation to ensuring the quality and efficiency of care and in keeping track of methods adopted, resources employed and outcomes achieved. The Data Protection Act of 1984 (another eerie Orwellian 1984 coincidence), the Access to Medical Reports Act of 1988 and the Access to Health Records Act of 1990 defined new legal rights and obligations in relation to access to and safekeeping of patient records. The changing social context of health care services and professional accountability was given further expression by the Patients’ Charter, setting out what patients should have a right to expect from the NHS, which was promulgated by the Government in 1991 and revised in 1995 and 1997. It was supplemented by the NHS Plan of 2000 and replaced by the NHS Constitution for England, in 2013.

The extent to which clinical services required the management of information, in one form or another, was realized to account for a significant proportion of health care expenditure–variously estimated to be between a quarter and a third. Medical notes had for years been perceived as outside the scope of management information systems for health care, connecting with them principally through secondary use extracts of their coded clinical data. But greater detail of what was being done, by whom, where, how and why, and with what outcome, was increasingly seen to be of primary importance for management, in both its clinical care and business-related aspects. Medical notes were still typically paper-based and carried around in huge piles by busy team members, with dictaphone in hand, reeling off cassettes of letters to family doctors, dictated after outpatient clinics or when in-patients were discharged home or to convalescence and social care services, sometimes to be typed up weeks later, in typing pools overseas.

The transition from paper-based to digital records was rising in importance for improvement on all these fronts, as well as support for population health and research. The report103 sought to balance issues of ownership, duty of care and access by patients, guided by emerging principles of personal information confidentiality, framed on guidelines published by the Organization for Economic Co-operation and Development (OECD). The recommendations were brief and generic: sort out immediate problems before attempting digital methods; experiment with patient-held records; research new technologies; NHS Executive to establish an advisory service covering research outcomes and best practice. Not exactly SMART objectives–Specific and Stretching, Measurable, Achievable and Agreed, Relevant and Time-bounded. But SMART methods are not smart for tackling wicked problems like this, as we come on to in Chapter Eight.

1996–Seeing the Wood, Sparing the Trees. Efficiency Scrutiny into the Burdens of Paperwork in NHS Trusts and Health Authorities–UK NHS

A new word was surfacing in the lexicon of health service management: burden. This report, commissioned at ministerial level, set out to balance the value of paperwork, as expressed by the parties involved, against the effort involved in creating, collating and handling it.104 Records and communications integral to patient care, information about the NHS as a whole and information that assisted management of the effective use of public funds were seen as more valuable. Unnecessary bureaucracy was seen to be often associated with poor relationships between organizations, poor quality of information and disputes. It noted marked variation across the services in these regards, with much good practice evident in organizations that were working well together.

Ways to make immediate improvements were foreseen, as were longer-term improvements that could be tackled through promoting a better quality of relationships and the use of information technology. Structural changes instituting new contracting arrangements for provision of care services had been introduced five years earlier and, in 1996, the five NHS Regional Health Authorities were disbanded, and new area-based Health Authorities created, with wider responsibilities for integrating health services across their communities. Formerly distinct Family Health Services and District Health Authorities were merged, giving opportunity for streamlining of management information flow across the NHS and in its relations with the Department of Health.

The report concluded that unnecessary bureaucracy was a systemic and cultural ill, which it was the responsibility of all parties to work on and improve. The overlapping of communications from the NHS Executive and the Department of Health was criticized. There was an urgent need to simplify and reduce the amount, duplication and complexity of management reporting, and to make greater use of operational data, rather than add additional information-gathering tasks. Each new policy proposal should be assessed for its likely impact on the administrative burden on the NHS. Greater investment in IT was recommended, along with prioritization and streamlining of IT procurement and of the twenty-five central Information Management Group development projects. NHS-wide networking was identified as the highest priority.

The review team recorded:

What has influenced us most is the compelling evidence that, where the various parts of the NHS have developed more mature relationships, rooted in trust and openness, where sensible judgements and decisions are made in partnership and cooperation, paperwork can be kept to a minimum.105

One reflects that disjointed goals and objectives, mutual rivalry and distrust, and narrow self-interest create unhelpful and unfruitful burdens both operationally and in legal matters within health care services and at their interfaces with the communities they serve and the industries they draw on. This, in turn, reflects an anarchic culture of social transition into the Information Age. It is ever more urgent to focus, in the way this report did, on longer-term efforts to overcome the fragmentation of information within non-communicating silos, which adds directly to information overload and burden.

1997–The Future of Healthcare Systems–Information Technology and Consumerism will Transform Healthcare Worldwide–BMJ Editorial, Richard Smith

This editorial of 24 May 1997 was a tour de force of radical journalism.106 It was inspired by a US thinktank organized by Andersen Consulting, where twenty-five people from across the world, including the BMJ editor, had debated how the world’s health care systems might develop. The urgency brought on by unsustainable current models and unpredictable futures convinced all that major change was imminent and would proceed for decades. Issues of cost, complexity, pace of change and changing consumer focus, as well as issues of choice and personal responsibility, and advancing science and technology, were voiced from the differing experience of participants from all over the world. The model of Singapore reflected personal responsibility of citizens, mandatory saving for health and co-payment of costs, resulting in only three percent of GDP devoted to health. The model of the USA–a mix of private high-quality care at the top, social insurance-based managed care plans in the middle and lower quality, government-funded care at the base–was costing nearer to twenty percent of GDP. The NHS and Sweden represented ‘socialized medicine’, alongside insurance-based systems from other countries.

The group was challenged by ideas forecasting the overturning of traditional models and relationships in the Information Age, achieving a government-regulated, ‘anywhere, anytime’ network of providers, suppliers, funders, insurers and consumers, with consumers playing a more central role. The methods evolved to monitor and manage quality, mix and cost of care, within managed care plans, and the information utility available to inform, guide and be shared among consumers were seen as likely to be applicable within all services, whatever the model for funding. The message of the final paragraph was that change is coming: ‘you ain’t seen nothing yet’!107

1998–Information for Health–An Information Strategy for the Modern NHS 1998–2005–UK NHS

Next along the line of policy statements came the most encouraging documents that I encountered, though sadly some turned into the greatest disappointments as they played out, dissolving or running into the ground. They networked, booked, scheduled and summarized, but did not integrate services and care records as they set out to do. On numerous occasions, I put the then crystallizing openEHR vision to the leaders and teams, from top to bottom, but we had little to demonstrate of implementation at the time. I suggested from the start where the Achilles’ heel lay: the method adopted for standardization of systems risked crippling the flow of information. Maybe they did not understand, or did not believe, or listened to and were reassured by more powerful voices who told them that they had the problems taped. Not so, it turned out.

With the change of government in May 1997, the NHS, and its progress in employing information technology, came under renewed scrutiny. A commitment to modernize and improve health care was expressed in a December 1997 White Paper publication, The New NHS: Modern, Dependable, and a Green Paper, Our Healthier Nation.108 These set out a ten-year programme to rebuild the NHS as a modernized service that is:

  • a national service;
  • fast and convenient;
  • of a uniformly high standard;
  • designed around the needs of patients, not institutions;
  • efficient, so that every pound is spent to maximize the care for patients;
  • making good use of modern technology, and know-how;
  • tackling the causes of ill health as well as treating it.

Associated with these, a further White Paper, A First-Class Service: Quality in the New NHS, was published in 1998, setting out a ten-year plan for securing quality improvement in the health care system.109 Here were announced: NICE (the National Institute for Clinical Excellence), the Commission for Health Improvement, National Service Frameworks, Primary Care Groups and the concept of Clinical Governance. In professional terms, much of this reconsideration and innovation did bed in successfully and effectively.

The heady wider ambition was followed up in September 1998 by the publication of Information for Health, An Information Strategy for the Modern NHS 1998–2005, written by a new Head of NHS Information Management and Technology (IM&T), Frank Burns.110 He had run IT at the Burton NHS Trust and knew what he was talking about. The central goal was that of supporting integrated care through NHS-wide standards and infrastructure. The document rehearsed the change of emphasis, from a strategy centred on management of care through an internal market, embodied in contracts between purchasing and providing organizations, to one centred on partnerships and performance. National benchmarks would be set for the quality and efficiency of services, in supporting individual patient care, enabling of public health improvement and provision of information to meet the needs of patients and the public. It was a refreshing and persuasive shift of emphasis.

Modernization of the NHS became a buzz-phrase more widely, and a national Modernization Agency was established, mirrored by boards at regional level, to encourage and foster the adoption of redesigned service delivery, adjusting to changing needs. My Medical School Dean at UCL nominated me to serve for an interesting period on the London Modernization Board, chaired by the dynamic Professor of Surgery and pioneer of robotic surgery at Imperial College, Ara Darzi, and I met there a wide range of committed people and teams from across the capital.

The Information for Health (IfH) strategy set out the commitments it was making.

It affirmed that:

The principles on which this strategy is based are:
  • information will be person-based
  • systems will be integrated
  • management information will be derived from operational systems
  • information will be secure and confidential
  • information will be shared across the NHS.112

The author had come to the fore through his leadership of a Trust where progress had been made towards clinically focused and integrated systems. The document set out his ideas for how this new national strategy could be implemented at local level. It set out two-, four- and seven-year targets up until 2005. He approached the challenge, as others had before him, emboldened by the certainty that the experience gained in what he had led locally provided an implementable global blueprint. Sadly, and ever more expensively, such confidence proved once again to be unfounded.

The first chapter of the publication considered support for direct patient care. It argued for two distinct but mutually integrated kinds of record. The Electronic Patient Record (EPR) would record ‘periodic care provided mainly by one institution’113–typically an acute hospital but also specialist units and mental health NHS Trusts. The Electronic Health Record (EHR) would provide ‘a longitudinal record of patient’s health and health care–from cradle to grave’.114 Given the defining mission of the NHS, as a universal service, free at the point of delivery, this was an extremely high level of ambition, and so it proved. It was captured in the publication’s Figure 3, which shows hospital, social care, community service and mental health services records all feeding into a ‘Primary Care Electronic Health Record’.115
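To make the EPR/EHR distinction concrete, here is a minimal sketch, in Python, of how episodic EPR extracts from different institutions might be assembled into a single longitudinal EHR view, keyed on a shared patient identifier. The class and field names are my own illustrative assumptions; they are not taken from the strategy document or from any NHS specification.

    from dataclasses import dataclass, field
    from datetime import date
    from typing import List

    @dataclass
    class EPREntry:
        """One episode of care recorded by a single institution."""
        nhs_number: str      # shared patient identifier
        institution: str     # the organization holding the source EPR
        episode_date: date
        summary: str         # free-text or coded summary of the episode

    @dataclass
    class LongitudinalEHR:
        """A cradle-to-grave view, assembled in this sketch within primary care."""
        nhs_number: str
        entries: List[EPREntry] = field(default_factory=list)

        def add(self, entry: EPREntry) -> None:
            if entry.nhs_number != self.nhs_number:
                raise ValueError("entry belongs to a different patient")
            self.entries.append(entry)
            self.entries.sort(key=lambda e: e.episode_date)  # keep the record longitudinal

    # Hospital and mental health EPR extracts feeding one primary-care-held EHR
    ehr = LongitudinalEHR(nhs_number="999 000 1234")
    ehr.add(EPREntry("999 000 1234", "Acute Hospital Trust", date(2003, 5, 1), "Admission for pneumonia"))
    ehr.add(EPREntry("999 000 1234", "Mental Health NHS Trust", date(2001, 2, 14), "Assessment and care plan"))

Even so simple a sketch begs the questions that follow–who agrees the content and structure of the entries, and how is meaning preserved as they cross organizational boundaries?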

The chapter rehearsed gaps that currently limited and inhibited progress towards this goal, starting with Primary Care records:

It is essential that health care professions agree the nature and content of the component datasets so that a consistent model of EHRs can be constructed.116
Currently there is no agreement on either the content, structure or potential use for patients, clinicians, public health specialists and planners of individual personal summary health records. The NHS must consider these issues in the context of developing integrated electronic records in Primary Care.117

The importance of protection of privacy was highlighted.

Moving on to consideration of the EPR and its role in provision of integrated care:

It is essential to create and maintain accurate, complete, relevant, up to date and accessible EPRs.118
As a minimum, coordination of care must improve across the following organizational boundaries: within the full primary care team, between hospitals and general practice, between health and social care.119

The report acknowledged the trailing-edge state of information technology in ‘most of the NHS’ and the partnerships, teamwork and funding needed to match the ambition of the strategy. It identified the ‘lack of a common primary care record structure’ and that GP, community and mental health systems were ‘proprietary systems with hardware and software which is incapable of coping with sophisticated EPR functionality’.120

Regarding EPR systems, the report concluded that ‘The NHS simply cannot sustain the present disparity in the level of information systems support to clinicians and must set a minimum level of development across the acute sector’.121 A six-level model was proposed, with functionality of increasing breadth and sophistication.122 The action target was for level three functionality (supporting clinical activity such as placing clinical orders, results reporting, prescribing and multiprofessional care pathways). For this work to progress, a consensus was required ‘on the content, structure and use of EHRs, with the health professional and managerial community, involving the views of patients, carers, and the public whom they serve’.123

Several new bodies were established to oversee the programme, including a Clinical Data Standards Board. This was led with great determination and skill by my clinical colleague, Martin Severs. Section 3.6 summarized current issues and problems in the support of integrated care–it may sound like a stuck gramophone needle, here, but the echoes back to the OTA Report in 1977 are so strong that they need emphasizing like this:

[These were]

Staff from across NHS centres, including those for coding and classification and for Casemix, were deployed to implement a clinically led Clinical Information Management Programme:

[… inheriting] existing work programmes covering:
  • clinical headings and definitions
  • clinical terms and coding classification
  • Casemix development
  • clinical messaging standards
  • condition-specific clinical data sets (e.g., for the cancer information strategy)
  • standard clinical record structures.125

The strategy set out national goals, identified central players and allocated resources, responsibilities and timescales. It focused on matters of who, when, where and why, but passed on the most fundamental question that subsequently bedevilled its implementation in real life, at the coalface of health care services, clinically, technically and organizationally. A simple question, unanswered: how? How concerns method–in this context, rigorous, clinically owned, implementable and trusted method, supporting design, development, procurement, operation and sustainability.

It waved its hands over the central importance of standards: but which standards, and how were they to be created, sustained and applied? It waved its hands over confidentiality: but how was this to be regulated and enforced through the design and operation of networks and systems? It waved its hands over public information. All, no doubt, were considered just matters of technical detail, within the grand scheme of things–at these heights, devils were elsewhere than in the detail!

There was a lack of realism about the scale of the problem and the challenges involved in transitioning to scale from prototype to a nationally integrated and trusted system. There was a lack of understanding of the accelerating evolution and growth of the Internet and World Wide Web, making the issues facing health care progressively global issues in global marketplaces. There was a lack of hands-on sense of the complexity and vulnerability of existing legacy systems, where simply maintaining them in day-to-day use was often a full-on challenge for local teams and suppliers, let alone revamping or replacing this legacy, to meet new requirements for integration within a common national framework and infrastructure.

This situation played out in a mismatch between the goals and capabilities of the NHS Information Authority in Birmingham, which was charged with making the strategy implementation concrete and coherent, and those of the local IM&T teams and software suppliers, who were tasked with keeping things running as they were while integrating the changes needed to align with the national steps towards the modernized infrastructure. The service at all levels faced conflicting and complex pressures and demands, from all directions–from top down, bottom up and in the compressed middle: in Trust boards accountable for local services and answerable to local communities; in higher levels of NHS management responsible to ministers; in local IM&T teams answerable to Trust Boards for maintaining local systems and managing relationships with suppliers.

Symptoms and breakdowns abounded. At the local level, a glimpse under the bonnet of the churning computer software revealed and reflected the intractability of the How question–string and sealing wax patches to creaking byzantine code deployed on ageing and incompatible technologies; inability or slowness in making any changes to respond to local or changing needs; IT professionals struggling to survive the stresses and strains. I knew and worked alongside the good and dedicated people struggling and coping at all these levels.

At the national level, the lack of demonstrably implementable technical and clinical standards on which to base the integration of systems proved the strategy’s Achilles’ heel. Standards governing the terminology and structure of records, and the electronic messages passing information between systems, were aspirations and works in progress internationally–not rigorous methods that could be relied on to exist and be fit for purpose, as the hand-waving of the strategy had rather assumed.
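To illustrate the kind of method that was missing, here is a toy sketch, in Python, of the two-level approach alluded to above in connection with the openEHR vision: a small, stable reference model for record entries, with clinically authored constraint definitions (archetype-like data definitions) maintained separately and used to validate data at the point of capture. The names and checks are my own simplifications for illustration; they are not the openEHR specifications themselves.

    from dataclasses import dataclass
    from typing import Any, Dict

    @dataclass
    class Entry:
        """Level one: a minimal, stable reference model for any record entry."""
        archetype_id: str
        data: Dict[str, Any]

    @dataclass
    class Archetype:
        """Level two: a clinician-authored definition constraining what an entry may contain."""
        archetype_id: str
        required_fields: Dict[str, type]   # field name -> expected type

        def validate(self, entry: Entry) -> bool:
            if entry.archetype_id != self.archetype_id:
                return False
            return all(
                name in entry.data and isinstance(entry.data[name], expected)
                for name, expected in self.required_fields.items()
            )

    # A blood pressure definition, authored and revised by clinicians, not by reprogramming systems
    blood_pressure = Archetype(
        archetype_id="illustrative.blood_pressure.v1",
        required_fields={"systolic_mmHg": int, "diastolic_mmHg": int},
    )

    reading = Entry("illustrative.blood_pressure.v1", {"systolic_mmHg": 128, "diastolic_mmHg": 82})
    assert blood_pressure.validate(reading)

The point of the separation is that software need implement only the small reference model once, while clinical content standards can be created, shared and revised as data definitions, without reprogramming every system–the kind of rigorous, clinically owned method whose absence the strategy had papered over.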

Computerization relentlessly exposes weak assumptions, at all levels, where unproven methods–assumed and sometimes promised by the industry to be straightforward, promises that in turn reflect the struggles the industry itself experiences–are invoked as solutions to unsolved and intractable problems. The achievement of consensus around a core method for defining implementable clinical data standards, and the safeguarding of confidential personal data, also proved long and exploratory processes. Faced with unachievable target dates and expanding workloads, the first three years of IfH implementation came to a crisis.

Thus was born the National Programme for IT (NPfIT), launched in 2002, at what was supposed to be the waypoint of the second two-year phase of the original seven-year IfH implementation. This programme, subsequently renamed Connecting for Health, marked a new stage in learning about standardization and implementation within varied local legacy contexts. Political agreement was sought and secured, at Cabinet level, for additional investment of central government funds, on the condition that the goals of Information for Health should be achieved within the coming three years. It was a guessing game, once again. With the Treasury more fully engaged, money flows were multiplied into billions, conditional on the acceptance of a centrally mandated strategy of regionally aligned procurements of systems.

Advised by leaders of the industry and commerce of the time, the story changed to one of central command and control of a limited set of systems implemented to operate right across the NHS, their performance enforced by legally binding and tightly managed contracts with suppliers. A new leader was appointed–from management consultancy, this time–to bring proven skills in large-scale IT contract management. This was Richard Granger. His leadership style was north country direct and south country rough and self-assured; I did not see much of him, but he was different, and I quite liked him. There were many hopeful industry participants–some of them heavyweight newcomers attracted by the money potential. Existing suppliers formed consortia along with new suppliers and consultancies, to bid for huge contracts. Major primary care system suppliers managed to face down the contract terms and stay out of these sorts of arrangements, notwithstanding some apparently rather heavy-handed attempts at coercion. A well-regarded and capable consultancy company was appointed to create a central Design Authority to underpin the programme.126

In these contracts, the money was certainly big, but the supposedly legally binding nature of the accompanying commitments on performance proved a mirage. There was a lack of grounded sense of the capability and experience of the supplier community. The order of the day seemed to be: make the contracts ‘water-tight’ and all will be well. They may have seemed water-tight but lawyerly adversaries, loyal to their clients, delight and drown in words and loopholes. They were certainly not ‘costly hot-air-tight’, either! At one stage in the ensuing chaos, I heard it said that if current suppliers could not match the moment, the NHS would specify and commission its own, wholly new system and have it developed in India! Bravado, delusion and folly, all in one!

This era was distinctly pricey and decidedly dicey! Leaders of the hospital systems industry were, however, delighted, and magnanimous in their lucrative victory. I met one of the leaders of HL7 at a party in London launching one of the major projects. The belief that the HL7 v3 standard was up to the challenge of integration of these systems was the orthodoxy of the day. He sipped from his glass of red wine and murmured, ‘If I were a British citizen, I’d be very worried about all this!’

New infrastructure–including the NHS-wide network and Spine for message communication between systems, e-prescribing, the booking system for appointments and summaries of care records (a small first step towards the aimed-for EHR)–made slow but steady progress. Integration more widely, within the five regional consortium contracts for secondary care systems, proved a major disappointment and failure, bringing a number of Trusts’ operations close to collapse, and the NHS to costly settlements in battles with suppliers that it lost. Proudly characterized at the outset as the largest public IT infrastructure project ever attempted, it ended its days described, after a subsequent Parliamentary enquiry, as the greatest computer procurement disaster of all time. The programme did, however, establish a baseline infrastructure that has endured and improved communication of data between different health care services and institutions. From a public perspective, the value for money achieved, set against all the surrounding disruption, was not good.

The gap between the consortia’s capabilities and products and the problems and needs they were intended to address, both in conception and in implementation, proved terminally great. Smooth and highly paid consultants with next to no grounding in health care, meeting battle-worn IT team members in Trusts to put them straight and bring them into line, was not a comfortable scene. I saw this first-hand when attending local and regional board meetings, representing UCL and local Trusts. The saga played out with bombast and threat over five years, and continuing failure, recovery and litigation over another five years. It was a noisy scene, protected from bullets for the usual five- to seven-year cycle of politically driven projects, and its leadership ultimately assured only its own self-destruction. There were big fights over blame and compensation, of course, and the NHS came off badly. There is forewarning, in this experience, of how the trading of NHS data in return for commercialized artificial intelligence might play out into the future.

Politics after this latter period became enmeshed with the global collapse of financial markets, consequent on the instability introduced into the management and mismanagement of money in the world economy of the Information Age. The NPfIT and CfH initiatives were roundly derided as having proved unfit and disconnecting for health. More truthfully, they exposed and reflected weaknesses of method and capability in providing and integrating useful electronic health care records. The priorities set by Barnett’s team in 1977, thirty years before, remained a grand challenge. In another time, Fred Brooks could have had no more consequential an example and case study for his book The Mythical Man-Month about the prime importance of architecture and architects.127

And as AI raises its head, new partnerships bring premonition of Faustian pacts, with the NHS as Faust, surrendering health care data for the knowledge and power over its use that Mephistophelian big industry promises in return. No doubt overly dramatic, but there is good reason to be clear and cautious.

2002–Securing our Future Health–Taking a Long-term View–The Wanless Report, UK Treasury

After Prime Minister Tony Blair initiated NPfIT, from 10 Downing Street, Chancellor Gordon Brown, across the road at the Treasury, decided to commission a long-term review of health care. There was much tense jostling of these personalities and their teams, at Cabinet level! The review was conducted by Derek Wanless, a former banking chief executive.128 His impressive command of detail heralded a substantial report, which he presented with clarity and authority. As a Treasury report, the focus was on financial implications for the service as a whole. In this regard, it highlighted low historic focus and investment in health care information and communication technology (ICT) as a major issue for government.

It was a hopeful breath of fresh air and projected a twenty-year forward view, as many such reports have done. That twenty-year horizon is now upon us, for this report! I quote here extensively from figures given in the report, as these are likely to have been the best estimates available at the time.

An Interim report outlined the Review’s three-stage approach:

  • Stage 1: to understand what patients and the public are likely to expect from a comprehensive, high-quality service available on the basis of clinical need and not ability to pay, in 20 years’ time;
  • Stage 2: to map the likely changes in health care needs, technology, and medical advance, workforce, pay and productivity; and
  • Stage 3: to assess how these changes will affect the resources required to meet patient and public expectations.129

From Section 4.3:

The health and social care asset base is huge: there are over 1600 NHS hospitals in the UK. There are around 10,500 primary care premises. The combined value of this asset base in England is estimated to be over £25 billion; the value of the social care asset base in England is estimated to be around £13.3 billion.130

From Section 2.5:

The interim report outlined what the review believed patients and the public will expect from the NHS in 2022: safe high-quality treatment; fast access and integrated, joined up system; comfortable accommodation services; and patient centred service.131

Looking to 2022, this would mean:

Modern and integrated information and communication technology (ICT) is being used to full effect, joining up all levels of health and social care and in doing so delivering significant gains in efficiency. Repetitive requests for information are a thing of the past as health care professionals can readily access patient’s details through their Electronic Health Record. Electronic prescribing of drugs has improved efficiency and safety. Patients book appointments at a time that suits them and not the service.132

Section 2.15 describes ‘a new “whole systems” relationship between self-care, primary, secondary, tertiary, and social care’,133 while Section 2.24 examines the then reality:

The health service makes very poor use of ICT. There are examples of successful use of ICT at local level, but systems have typically been developed and installed in a piecemeal fashion. This prevents the effective integration and sharing of information across a wide range of health care providers.134

From Section 2.27:

A safe system is an integrated system where there are effective links and good communications between different parts of the service and beyond. This was highlighted by many respondents in consultation, who especially pointed to problems in social care impacting on the effectiveness of the NHS.135

And from Section 2.32:

At the heart of the 2000 NHS Plan’s quality strategy is the development of National Service Frameworks [NSFs] which set out national standards for catching up to a high quality, integrated service in key areas, initially coronary heart disease, cancer, renal disease, mental health, diabetes, older people, and children.136

From Section 2.35:

The NSFs [National Service Frameworks] aim to reduce health inequality by improving access to care for those most in need and currently least likely to receive it. A range of sources suggest that, although need for treatment often increases with the level of deprivation, chances of receiving treatment decrease. This so-called inverse care law is likely to be the result of people from lower socio-economic groups having less access to care facilities, presenting at a later stage of disease development and being less demanding of medical professionals.137

The report presented three scenarios of implementation of change, characterized as: solid progress, slow uptake, fully engaged. With regard to self-care, it says:

Increased self-care, and the more aware and engaged public associated with it, could result in useful cost-benefits for the health service both in terms of levels and effectiveness of resources, arising from more appropriate use of health and social care services.138

With regard to genetics, it considers the impact very uncertain and concludes it is unlikely to be large by 2022.139

With regard to ICT, it remarks that expenditure per employee is the lowest of any sector of the economy. It expects expenditure on infrastructure, electronic patient records (EPR), telecare for chronic conditions, clinical governance support, and training to double–citing US projects that projected savings in other costs.140

In Section 3.74, there is a massive and fateful caveat:

How effective this investment proves in delivering a higher quality, more responsive health service and in reducing costs will depend on the quality of implementation [my emboldening]. In particular, it will depend on the extent to which the investment takes place in an integrated manner with consistent standards across the whole service.141

The report’s fifth chapter considers resource implications in its three scenarios. It expected health expenditure to rise from 7.7 percent of GDP to 10.6 percent, 12.5 percent and 11.1 percent respectively under solid progress, slow uptake and fully engaged. This represented the NHS budget rising from sixty-eight billion pounds to one hundred and fifty-four billion, one hundred and eighty-four billion and one hundred and sixty-one billion pounds respectively, front-loaded for change over 2002–08.142 (A rough check of what these figures imply is sketched after the quotations below.) The report notes that the UK figure has historically been 1–1.5 percentage points lower than the EU average since 1972. In Box 5.1, it warns that spending does not guarantee outcome, quoting comparative figures from Sweden and the USA in relation to life expectancy.143 In relation to ICT budgets, it expects these to double to 1.2 billion pounds per annum. In relation to social care (Sections 1.2–2.1, 2.5, 2.0–2.7, 2.9, 3.4 and Chart 5.8), it shows costs almost doubling between 2002 and 2022, from 6.4 billion pounds to 11 billion pounds; this does not include the cost implications of improving its quality. In Section 5.58, it makes a strong statement about health and social care, as follows:

Health and social care are inextricably linked [my emboldening]. There are many interactions between the two sectors. For example, recent increases in the number of older people being admitted to hospital in an emergency partly reflect reductions in the availability of appropriate social care. In planning the delivery of care, health and social care must be considered together in order to ensure that both provide high quality services for the individuals receiving care and make efficient use of resources.144

And in Section 5.59:

this demonstrates the need for a greater focus in future on whole systems modelling to help provide a better understanding of the interactions between health and social care and the implications for the level of resources required.145
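Pausing on the headline expenditure figures above: a rough back-of-the-envelope check (assuming the cash figures and the GDP shares are quoted on the same basis) shows what they imply about the review’s underlying view of the economy:

    \[
    \text{implied GDP} \;=\; \frac{\text{NHS budget}}{\text{share of GDP}}
    \]
    \[
    2002:\ \frac{68}{0.077} \approx 880\ \text{(£bn)};\qquad
    2022\text{–}23:\ \frac{154}{0.106} \approx 1450,\quad \frac{184}{0.125} \approx 1470,\quad \frac{161}{0.111} \approx 1450\ \text{(£bn)}
    \]

On this crude reading, all three scenarios rest on broadly the same assumption about the future size of the economy–roughly two-thirds larger than in 2002, in the report’s own terms–and differ chiefly in the share of that economy devoted to health.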

The sixth chapter of the report focuses on the effective use of resources. The first area identified is: ‘Setting national standards for clinical care and an integrated ICT system [my emboldening]’.146 It ranges widely over balances that need to be struck in improving health: incentives and targets, national and local standards, health and social care, primary and secondary care, treatment and prevention, maximizing health gain while making the setting of care delivery the most appropriate and efficient, audit and public engagement.

Box 6.1 focuses on standards, processes and delivery.147 The report says that ICT standards must be set centrally but does not go further. In Section 6.18 it concludes that poor performance in ICT reflects inadequate budget and lack of standards.148

Section 6.21 brings the report to a crucial conclusion:

If these issues can be addressed, the review believes that national, integrated ICT systems across the health service can lay the basis for the delivery of significant quality improvements and cost savings over the next 20 years. Without a major advance in the effective use of ICT (and this is a clear risk given the scale of such an undertaking), the health service will find it increasingly difficult to deliver the efficient, high-quality service which the public will demand. This is a major priority which will have a crucial impact on the health service over future years.149

Other sections are also of significant note: Section 6.36 on collaboration with the private sector; Section 6.40 on the balance of health and social care–which it believes to be wrong, where acute bed costs are three hundred pounds per day (Section 6.46).150 It sees Primary Care Trusts’ (PCTs’) control of fifty percent of the NHS budget in 2002 rising to seventy-five percent in 2004. Box 6.3 focuses on a new balance of diabetes care;151 Section 6.81 on partnership rights and responsibilities and better ways of keeping the public informed;152 Section 6.91 on ownership of health status.153 The report looks further at demographic trends for 2020–40 that are likely to impact adversely on health care services.

In relation to the Summary Recommendations in Section 7.6, the report recommends unifying health and social information resources (A.4) and improving modelling with better ICT (A.9).154 In its forward look, it brings together two contributory trends: population health (dependent on age structure, genetics, lifestyle, effectiveness of health service; C.29); and tomorrow’s patient (who will be better informed, educated and affluent but who will have less time and be less deferential to professionals, and who will be able to compare the service against alternatives, wanting more control and choice; C.38).155

On reflection, as I review and write this now, it is difficult to imagine a more thoroughly executed and thoughtfully appraised policy review on health and social care. Wanless was extremely impressive as a presenter and I have no doubt he was listened to, despite the much-telegraphed Number 10 versus Number 11 (Prime Minister versus Chancellor) Downing Street politics of the era. One can only conclude that the central political culture of the NHS within government had no means to translate the review into action. It lacked traction in moving forward on the central issues the report so powerfully and persuasively highlighted, especially the effective use of ICT and the integration of health and social care.

2002–National Specification for Integrated Care Records Service–UK Department of Health

This was almost the last throw of the dice for the NHS Information Authority, before the NPfIT juggernaut rolled in.156 It lumped together a huge set of existing data sources and services currently operational within the NHS, directly and indirectly supporting patient care, that needed to be brought into an integrated infrastructure. It offered no architectural or design solutions and was more a voluminous statement of requirements.

I was subsequently asked to take part in a review of some seven hundred proposals from across the NHS, seeking funds from a pot of money allocated to projects showing how they would integrate their patient records along these lines. It was a salutary exercise. I saw almost no proposals that extended much further than a bolting together of existing data sources and their presentation to users through a Web integration engine. An openEHR-coordinated proposal, from Bill Aylward and me, linking OpenEyes and other open-source patient records onto a platform we called Orsini–Open Records Standardization INItiative–was put together. This was, basically, what in due course came to life in the EtherCIS and then EHRBase platforms, as described in Chapter Eight and a Half. We were advised that the Treasury had diverted some of the allocated funds to cover an NHS overspend that year, and the project, although shortlisted, did not secure support.

2003–The Quest for Quality in the NHS–A Mid-term Evaluation of the Ten-year Quality Agenda–Nuffield Trust

Improvement in the quality of care was a major concern addressed in a set of consultation documents and White Papers in the years following 1997. At the mid-point of the NHS ten-year quality improvement programme, the Nuffield Trust published an evaluation of progress.157 It set four objectives:

  • a review of the vision, strategy and structural changes that underpin the quality agenda;
  • a synthesis and presentation of data to evaluate quality in multiple dimensions;
  • an in-depth analysis of key components of the quality agenda, including the role and contribution of organizational culture, primary care, patient engagement initiatives, information technology and public reporting for accountability;
  • summary analysis and recommendations.

It was useful in drawing together the welter of new initiatives and acronyms, which it described as ‘numbing’. By 2004, these had included:

  • National Institute for Clinical Excellence (NICE): created in 1999 to publish cost-benefit analyses for technologies and pharmaceuticals;
  • Commission for Health Improvement (CHI) and later Commission for Healthcare Audit and Inspection (CHAI): established as an independent regulator of NHS Performance;
  • Modernization Agency: established in 2000, to help local clinicians and managers redesign local services around the needs and convenience of patients;
  • National Patient Safety Agency (NPSA): established in 2001, to coordinate efforts to learn from and prevent adverse incidents after the scale of these had been revealed in a 2000 report of a team led by the CMO [Chief Medical Officer of the Department of Health]–eighty-five thousand per annum with one out of five incidents leading to disability or death;
  • National Clinical Assessment Authority (NCAA): established in 2001, to provide support to Trusts and Health Authorities faced with concerns over the performance of individual doctors;
  • Commission for Patient and Public Involvement in Health (CPPIH): established in 2003, as an independent body to champion greater public involvement in health-related policy and decisions.

The report talked of the need for ‘development of routine data collection, analysis and reporting capability to monitor quality’. It listed a set of critical tasks covering standards setting, development of quality measures, data collection and analysis, leading to the design, based on evidence, of ‘interventions to predictably improve patient care’.158

The recommendations were couched, predictably, in the language of further new initiatives!

Recommendation 1: Establish a National Quality Information Centre. […] England’s Quality Agenda simply cannot thrive in an environment that is deficient in access to valid, reliable data, and in the necessary analytic and interpretive skills for expert performance evaluation and credible reporting […] Rectification of these problems calls for a comprehensive strategy that encompasses an information systems infrastructure, electronic patient records, and expert informatics. […] The NHS is deficient in well-organized data that produces coherent, defensible, credible, and actionable analyses of system performance and clinical quality. The lack of a shared robust information base that provides a common understanding of the NHS’s strengths and weaknesses jeopardises the quality agenda and prevents the various organizations and initiatives from living up to their potential.159

There is little if any discussion about how quality improvement might better be rooted in support for self-assessment conducted among clinical practice teams at the coalface of care.

Two authoritative books published by the US Institute of Medicine (IoM) stood out: Crossing the Quality Chasm and Computer-Based Patient Record.160 A great colleague of the times, Donald Detmer, played prominent roles, both at the IoM and in the UK, on sabbatical in Cambridge at the Judge Institute of Management and on the board of the Nuffield Trust.

2004–Diagnostic Audit 2003–04–UK Audit Commission

The Audit Commission investigated the state of information systems across the ninety-three NHS Acute Trusts. I discuss this report in the section of Chapter Eight devoted to the work and contribution of my colleague, Jo Milan, in the context of the Trust that was, by far and away, the report’s outstanding exemplar of high-quality, clinically valued, paperless and cost-effective information services. He was the physicist, engineer and IT architect and lead who made it so. More could have been learned from studying the design and understanding the success of this exemplar, the Royal Marsden Hospital in London, than from the writing and reading of all the other Whitehall and NHS reports of the era put together.

A health minister of the times, Helene Hayman, asked me where the IfH strategy of 1998 might be suitably launched, and I recommended The Marsden. It was a great day, and the Secretary of State, Frank Dobson, the NHS Chief Executive, Alan Langlands, and the IfH author, Frank Burns, gave strongly supportive talks. Jo’s team’s work was presented to the press. For whatever reason, NPfIT subsequently ignored it completely. Jo contributed hugely to the early days of the GEHR project and openEHR, as I record in Chapters Eight and Eight and a Half.

2005–World View Reports–UK Department of Health, Denis Protti

Another good colleague of the era was Denis Protti, who, like Donald Detmer, came to the UK on a sabbatical visit and was commissioned by the Department of Health to write a set of reports summarizing health informatics research and development of the era.161 It was an interesting and informative collection, summarizing initiatives across a wide range of activities.

2007–e-Health for Safety–Impact of ICT on Patient Safety and Risk Management–UK NHS

This document came from the NHS team dedicated to patient communications and safety.162 It summarized well-established work on clinical risk and suggested how ICT might fill the gaps in information that lead to harm. I heard the clinical lead for the work present on this theme at the Royal College of Physicians. I asked them about risk arising when ICT systems fail or are intrinsically incapable, through design, of communicating with one another correctly, fully, or in meaningful context. It seemed a thought that had either not occurred to them, or was deemed of minor, esoteric significance.

I had a similar experience at a USA/UK intergovernmental conference on care quality, where geographical information systems were under discussion in relation to public health services. I found myself in conversation, at a coffee break, with a national legal ombudsman from a prominent Commonwealth country and the Head of the UK Care Quality Commission. The preceding talk had demonstrated drilling down through public health datasets, to identify geographic proximity of the homes of unidentified people presenting with communicable disease. I asked them how they saw the legal data protection framework interacting with such information utility, where it would be straightforward to identify individuals from supposedly anonymized datasets. They saw no difficulty, saying it was surely a simple matter of process to safeguard against such re-identification of the data!
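The point is easily demonstrated. Here is a minimal sketch, in Python, of the well-known observation that a handful of quasi-identifiers–a postcode district, an age band, a diagnosis–can single out individuals in an ‘anonymized’ extract; the data and field names are invented purely for illustration.

    from collections import Counter

    # A toy 'anonymized' extract: no names, but quasi-identifiers remain
    records = [
        {"postcode_district": "EC1A", "age_band": "70-74", "diagnosis": "TB"},
        {"postcode_district": "EC1A", "age_band": "30-34", "diagnosis": "TB"},
        {"postcode_district": "EC1A", "age_band": "30-34", "diagnosis": "TB"},
        {"postcode_district": "NW3",  "age_band": "50-54", "diagnosis": "TB"},
    ]

    quasi_identifiers = ("postcode_district", "age_band")

    # Count how many records share each combination of quasi-identifiers
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)

    # Any combination held by only one person is, in effect, an identification
    unique = [combo for combo, count in groups.items() if count == 1]
    print(unique)   # [('EC1A', '70-74'), ('NW3', '50-54')] -- two people singled out

Anyone who knows that a particular elderly neighbour in that district has been unwell can read the diagnosis straight off such an extract. Guarding against this requires design–for example, ensuring every released combination of quasi-identifiers is shared by at least a minimum number of people–not merely ‘process’.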

2016–Making IT Work: Harnessing the Power of Health Information Technology to Improve Health Care in England–The Wachter Review, UK Department of Health

The early years of the twenty-first century were marked by political turmoil, war and financial crisis. Understandably, the eyes of government were not on the progress of health reforms. Politicians had set major programmes in motion and their attention turned elsewhere. It was a good while before the reality of continuing and growing turmoil in health care services rose back up the political agenda–and thus became a prudent time for a new wide-ranging review!

Here, from the Terms of Reference of the 2016 Wachter Review–déjà déjà vu!

The review will inform the English health and care systems approach to the further implementation of IT in health care, in particular the use of electronic health records and other digital systems in the acute sector, to achieve the ambition of a paper-free health and care system by 2020. It will have a particular focus on issues around successful clinical engagement with implementation.163

Here, from the conclusions:

We believe that the NHS is poised to launch a successful national strategy to digitize the secondary care sector, and to create a digital and interoperable health care system. By using national incentives strategically, balancing limited centralization with an emphasis on local and regional control, building and empowering the appropriate workforce, creating a timeline that stages implementation based on organizational readiness, and learning from past successes and failures as well as from real time experience, this effort will create the infrastructure and culture to allow the NHS to provide high quality, safe, satisfying, accessible, and affordable health care.164

And:

The experience of industry after industry has demonstrated that just installing computers without altering the work and workforce does not allow the system and its people to reach this potential; in fact, technology can sometimes get in the way. Getting it right requires a new approach, one that may appear paradoxical yet is ultimately obvious: digitizing effectively is not simply about the technology, it is mostly about the people. To those who wonder whether the NHS can afford an ambitious effort to digitize in today’s environment of austerity and a myriad of ongoing challenges, we believe the answer is clear: the one thing that NHS cannot afford to do is to remain a largely non-digital system. It’s time to get on with IT.165

The what is the same. The how is two words–standards and interoperability. The invocation, as ever, is to bite more bullets, albeit sadly now with long-broken teeth. The question that politicians might better have asked is why, given the history, these things are being said pretty much as they were said twenty and fifty years ago, and what this means for health care services moving forward. Health care IT has been on a long runway and runways do end. Some planes do not, or cannot, take off. We must cast our eyes more widely over passengers and crew, destinations, modes of transport, kinds of machines and means of navigation. Wachter pointed to people, not technology. The next major review focused there.

2019–Preparing the Health Care Workforce to Deliver the Digital Future–The Topol Review, UK Department of Health

This review was led by another eminent US clinician and professor of medicine. He comes from a long line of eminent US academics who have stood high over medicine and IT, back to Barnett at Massachusetts General Hospital (MGH), whose contributions featured at the start of my archive. They have lived in a wealthy environment where health care expenditure is the highest, both per capita and in proportion to GDP, and in total amount, in the world, where private medicine predominates and where research and industry are well-funded and organized. It is a country that has long featured strongly politicized and polarized debate about population health and individual health care–the individual has access to the best in the world, and the population overall fares poorly. Individualism and socialism are the ’isms of political tribes.

In my experience, such leaders value and admire the NHS for its mission and culture but believe, and live their lives, in the resource-rich environment of a different mission. They seek the cohesion of an NHS and wish to add to it the science and products they create and use in their mutually supportive settings of academia, health care and commerce. The business of health care is that environment’s central focus. The inequalities that prevail are left to individuals to put right on their own account.

One comment stood out as I read the report, which reiterated the NHS focus of many years on objectives for management, not for clinical care. I recalled a precisely similar comment by Douglas Black, President of the Royal College of Physicians, in a leading article in the prominent British Medical Journal (also discussed above), commenting on the Körner Report in 1982, where he also said that good management of health care services is important and so is good management of patient care. They are not the same thing, but they are not separate things–they connect. Black quoted a King’s Fund paper, as follows: ‘Information technology is only exploited to the full when developments are information led, so that the information requirements must be identified first and only then a choice made from the wide range of technology available’. He added that ‘the point could perhaps be made more simply–“Don’t choose a computer until you know what you want to do with it”’.166 This much has long been known and has long exercised government.

In the Topol Review conclusions we find a welcome strong emphasis on people, implementation and learning:

This is an exciting time for the NHS to benefit and capitalise on technological advances. However, we must learn from previous change projects. Successful implementation will require investment in people as well as technology. To engage and support the health care workforce in a rapidly changing and highly technological workplace, NHS organizations will need to develop a learning environment in which the workforce is given every encouragement to learn continuously. We must better understand the enablers of change and create a culture of innovation, prioritizing people, developing an agile and empowered workforce, as well as digitally capable leadership, and effective governance processes to facilitate the introduction of the new technologies, supported by long term investment.167

Here is the scope of the report, from its Table of Contents:

  1. Introduction
  2. Ethical considerations
  3. The top ten digital health care technologies impacting the workforce
  4. Genomics
  5. Digital Medicine
  6. AI and robotics
  7. Health care economics, productivity and the gift of time
  8. Organizational development
  9. Providing a learning environment for education and training168

And, in case you thought I would not mention it yet again, Section 6.2 notes as its first priority that ‘For data-driven and autonomous technologies to flourish the following are required: the digitisation and integration of health and care records; […]’.169 But this is only a ‘what’ and, as ever, there is no sign of a ‘how’, nor any evidence of learning from past efforts. No USA Presidential seeing of the world ‘as it never was and wondering why not’.

2010 and 2020–The Marmot Reviews

This 2010 review, and its update in 2020, revisited the social disadvantages that were surveyed in the Beveridge Report of 1942, but with a now more specific demographic and epidemiological focus on health inequalities in the very different society of today.170 The trend over the past decade is correlated with the pattern of reductions in government expenditure after the financial collapse of 2008.

It is focused on the factors that act as social determinants of health–how disadvantage due to poverty and disability is associated with declining health and lower life expectancy. It shows how health outcomes have stalled over the past decade, and how policy measures have exacerbated decline through disproportionate withdrawal of support from those most in need.

The recommendations are hard-hitting and focus on advocacy of health policy as the foremost responsibility of government. The recommendations prioritize a national focus on the needs of children and support for families in poverty. They place considerable emphasis on the need for a holistic approach, coherent at local and government policy levels, working to ensure good local work opportunities for all citizens, combined with a safety net of state benefits centred on a guaranteed minimum wage. The report does not cover the design and operation of health care services, although there are clearly substantial dependencies between these and the wider issues of poverty and inequity in society, which the report charts with great clarity.

1970–2020–Fifty Groundhog Years

Parturient montes, nascetur ridiculus mus [the mountains will go into labour, and a tiny little mouse will be born]171

I see a double message here, no doubt unintended by Horace! The obvious one is that large-scale endeavours can lead to incommensurately small-scale outcomes. That certainly applies in relation to the mountains of money that have been spent on IT systems, overall, often yielding relatively small benefits in the delivery of health care. More idiosyncratic–remembering that ‘muscle’ comes from the Latin for ‘little mouse’–is the message that little things can emerge as powerhouses of the big things in life. Little Data, as I have discussed elsewhere, is what Big Data is built from. Little things can operate below the radar of the big. Simple things can provide keys to unlock the intractable complexities of bigger things.

In his book that I drew from in Chapter Six, Ian Stewart described a simple mathematical insight that unlocked understanding of the configuration of viruses of increasing size.172 As also discussed in that chapter, John Wheeler surmised, in setting out his ‘it from bit’ ideas, that the key to unravelling many contemporary unknowns of fundamental physics may prove much simpler than current complexities might indicate.173 Quite simply stated positions can unlock complexity and also focus action. The key to their success is that they align purpose, goal and method with traction in enabling, making and sustaining effective action. For example, a single newspaper article sixty years ago coining the term ‘Prisoner of Conscience’–for individuals imprisoned for opposing powerful governments–led to the simple and apolitical action of writing letters on behalf of these individuals and sending food and clothing to support them and their impoverished families. It was simple to get involved and it led to the worldwide movement of Amnesty International.

In my review of key documents along the timeline surveyed in the preceding section, the Marmot Reviews seemed fitting final documents to place in apposition to the first OTA Report of 1977, on medical information systems. Technology has changed beyond recognition since that long-ago report, but the core issues it identified, affecting successful implementation, remain substantially unchanged–not so much perennial as ‘per-multi-decennial’! Health care has likewise changed beyond recognition since those times, but Marmot charts inequalities of health that are, in his estimation, stalled or getting worse, with uncomfortable comparisons to those highlighted in the 1942 Beveridge Report. How has society’s transition into the Information Age been implicated in these stark realities, I wonder? I reflect, now, on the fifty Groundhog Years of health information policy.

On reading again the Wachter and Topol reviews of 2016 and 2019, and thinking about what has changed since the mid-1970s, when I started my first academic post at Bart’s in medical computing, my reflections focused on what has not changed. Remedies are prescribed and swallowed, repetitively, as the problems repeat–a bit like an inappropriate drug treating recurrent indigestion. Here, again, are the prescient mid-1970s concerns about future policy for medical information systems, as expressed by the great Octo Barnett and the team assembled around him. I have labelled, numbered and emboldened them, to correlate with my following comments on how the intervening years have played out.

  1. Policy: ‘Without a federal policy towards these systems, their diffusion may well proceed indiscriminately, and standardization will not be possible. If so, the full potential of medical information systems is not likely to be achieved’.
  2. Adaptability/Agility: ‘Prototype medical information systems have been proven technically feasible, but most have not yet been made adaptable to the various conditions of different institutions. In order to realize the benefits of a standardized database and to market systems economically on a large scale, flexible systems are required’.
  3. Granularity: ‘The capability to accumulate and retrieve data for each patient is critical for both the process of patient care and research’.
  4. Combined clinical and business/administration needs: ‘An important capability […] is to provide necessary data for administrative and business needs’.
  5. Mutual understanding of clinical and engineering domains and need for long-term investment: ‘[Common reasons accounting for early failures in the 1960s were seen to be] inadequate understanding of the complexity and variations in medical care, inadequate computer hardware and software, and inadequate commitment of capital for long term development’.
  6. Clinical standardization: ‘At present, lack of standardized nomenclature or established protocols in medical care continues to constrain the development of a generalised database’.
  7. Diversity of non-communicating architectures and technology dependence: ‘Because medical information systems have been developed through the independent efforts of many investigators, today’s systems reflect diversity of philosophies and technical approaches’.174

They looked at two paths ahead in the wood. The first is a free market, which they considered too risky.

  8. Option to allow a free market to develop:
    • ‘The federal government could continue current policies and allow adoption of medical information systems to be determined in the open marketplace. However, this policy could result in medical information systems being marketed and adopted without additional investment in research to improve certain capabilities. Because capabilities to improve and monitor the quality of medical care and to facilitate research and planning are the least developed and require standardization, these potential benefits for patients and the medical care system might be lost. Computer systems limited to administrative and financial functions could continue to dominate the market. Medical information systems that might be used could also lack high standards of quality or provide inadequate protection for the confidentiality of patient data’.175

They proposed a second approach: the central shaping of the market with investment incentives to encourage coherence in knowledge bases and databases, encompassing language and workflow.

  9. Proposal for a national authority to coordinate systems design, common datasets and protection of confidentiality of patient records:
    • Central organization to develop, validate, and maintain the knowledge content of medical information systems.
    • Standardized databases, to include nomenclature, terms, definitions, classifications, and codes for use in systems.
    • Guidelines for precise standards to protect the confidentiality of patient data.176

Reflecting on the intervening decades from the mid-1970s until today, I see numerous issues that have emerged, mirroring the concerns highlighted in the 1977 OTA Report. There has been progress, mixed with regret and disappointment, in national programmes internationally. These programmes have encountered issues that cannot be resolved by any amount of government spending; a new approach is needed.

The expression of clinical and health system requirements, and capable and proven technology to meet them, have typically been, or quickly became, a poor match, failing to synchronize and keep pace with one another. ‘Imagineering’, the application of imagined method to poorly framed requirement, results from failures of discipline, profession, industry and working environment. It is akin to building bridges with little understanding or experience of the stresses and strains of mechanical structures under load, or of the context of their use. Expensive wobbly bridges have been writ large within the health care software systems of our age. We have seen them fail but have not understood or learned from the nature of the wobbles and collapses.

  1. Policy

The OTA foreboding has been borne out by events. Health care IT has been a huge and consequential policy and market failure; it has cost too much and delivered too little, in terms of both money and burden on frontline care. I do not think one needs to read more than the Beveridge Report, the OTA Report, the Wanless Report, the Topol Review and the Marmot Reviews to get answers to almost every question, save one, that policy must address. This is the most important one–how? These are issues central to health care, in terms of professional practice, education, research, management and governance. How is it that the NHS has always looked to senior and experienced clinicians from the USA to guide its policy, given how US foundations, commentators and presidents have appraised achievements there? And why has it looked to almost every discipline and profession, save two, to lead its plan? It has handed the mantle of leadership to a succession of appointees who all went twist and bust: an NHS manager, a physicist, a computer scientist, a hospital IT manager, a management consultant, a journalist, a civil servant and a diplomat. But never to a professionally trained and experienced clinician, versed and trusted in the complex realities of coalface clinical care, and never to an experienced engineer, trained and versed in the architecture and implementation of complex engineering systems. That says a lot about the repeating failure of traction, and of the competence to construct and execute a realistic policy and plan. It says a lot about the clinical professions, too, that they did not use their power and influence to insist otherwise, other than in the wise and unheeded advice of Douglas Black at the time of the Körner Report, when hyper- and top-down managerialism took a much stronger hold at the centre. It says a lot, too, about the elitism of politics and the derogation of the importance and contribution of engineering in making things happen. The leaders of the professional bodies have a lot to do. I suspect they, along with most hospital managers, have been fearful of career suicide by becoming too involved.

Of course, health care has not been alone in these kinds of failure within the public sector. Many such failures have common origins, as well as distinguishing features. At the heart of the policy failure has been the question of ownership of the domain. Health care information systems are instantiated within a community of three communities–of citizens, services and businesses. It is a domain where each of the three has a characteristic interest, all fundamental and in need of one another. Each needed to change, and each needed to learn from the others, and thereby learn about itself. The intersection of these evolving interests is the wider community interest they all serve–that is the proper focus of policy, and where governance and trust must be earned. Each of the three has a different perspective on the issues they face as a group: implementations that interface at the coalface of care, and the secondary interfaces of education, research and management that are integral with the health system, along with the processes, technologies and governance they entail. The conflicts of interest that inevitably arise can only be resolved in the context of overarching community interest, common ground and joint implementation endeavour.

This is where the future care information utility, serving a coming era of Information Society health care, must be owned, and positioned as an evolving reality, seeking better balance, continuity and governance of services, and efficiency and effectiveness in the methods they employ and their validation. Today, clinicians, managers and technologists sit around a circle, and blame circulates clockwise and counterclockwise. The patient sits in the centre and feels bemused, and everyone blames the politicians, who watch from a circling helicopter, throwing out the occasional bag of confetti money and defending themselves to one another.

  2. Adaptability/agility

There has been a deficit of sustained, coherent, clinically-informed and -led policy, and appropriately targeted resources. Design, implementation and practice have proceeded piecemeal. This has, in large part, reflected barriers that practising clinicians have experienced, or by default imposed, limiting their practical engagement with an area so fundamental to their work. Good development and prototyping tools were not available to help them in this.

A 1970s IT mindset has permeated throughout, conditioned by the waterfall model of system development and implementation as an essentially sequential process. Systems today are designed and implemented using more agile methods, which recognize the chameleon-like qualities of the problem addressed. System architects need to be able to rescope and redesign their work as its practical implementation and use evolve. We have learned a great deal about the stacks of software that integrate from the local user interface to server farms and data stores distributed in the Cloud. We have learned new discipline, and accessible technology now spans these dimensions.

  3. Granularity

We have also started to turn the world upside down and work from the patient outwards in the methods for structuring and persisting data, so that it can be searched and analyzed with generic methods and software tools. Architecture can now embrace a hierarchy of granular and structured detail about patient care combined with less structured data. The OTA recommendation is that systems must be able, as a priority, to answer all questions about individual patients and their care, and as a secondary purpose also provide valid data at organization and population levels, with no further data collection burden imposed on frontline staff. This is simple to state but extends deep into issues of architecture and design, where these can only be arrived at iteratively over time, testing ideas at each stage in a real-life context. This scarcely ever happens in top-down driven implementation. We need the tools and teams to enable it to be approached from the coalface of care, outwards and upwards. This has been the unifying focus of the pioneers I describe in Chapter Eight.
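To make the idea of granular, patient-outward structuring a little more concrete, here is a deliberately minimal sketch in Python. It is my own toy illustration, not any published reference model (openEHR or otherwise), and the class and field names are invented. It shows how a record built from small, named, attributable entries can be queried generically for individual care, with organization- and population-level views derivable later from the same entries rather than from separate data collection.

```python
# Minimal sketch (invented names): a patient-centred record of granular entries
# that generic software can query, with no extra collection burden at the coalface.

from dataclasses import dataclass, field
from datetime import datetime
from typing import Any

@dataclass
class Entry:
    """One granular, attributable clinical statement, e.g. a blood pressure reading."""
    name: str                 # the kind of statement this entry records
    data: dict[str, Any]      # structured detail: values, units, relevant context
    recorded_at: datetime
    author: str

@dataclass
class PatientRecord:
    """A patient-centred record: a growing list of granular entries."""
    patient_id: str
    entries: list[Entry] = field(default_factory=list)

    def add(self, entry: Entry) -> None:
        self.entries.append(entry)

    def query(self, name: str) -> list[Entry]:
        """Generic retrieval: every entry of a given kind, in time order."""
        return sorted((e for e in self.entries if e.name == name),
                      key=lambda e: e.recorded_at)

# Individual care questions come first; aggregate views can be derived afterwards
# from the same granular entries.
record = PatientRecord("patient-0001")
record.add(Entry("blood_pressure",
                 {"systolic": 132, "diastolic": 84, "units": "mmHg", "position": "sitting"},
                 datetime(2021, 3, 1, 9, 30), "Dr A"))
record.add(Entry("medication_order",
                 {"drug": "amlodipine", "dose": "5 mg", "frequency": "once daily"},
                 datetime(2021, 3, 1, 9, 45), "Dr A"))

for e in record.query("blood_pressure"):
    print(e.recorded_at, e.data["systolic"], "/", e.data["diastolic"], e.data["units"])
```

Real systems separate the generic record structures from the definitions of clinical content far more rigorously than this toy does; the point here is only the direction of travel, from the patient outwards and upwards.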

  4. Combined clinical and business/administration needs

Health services require a wise mix and capacity of health care professional expertise, combined with efficient administrative, managerial and governance arrangements. It is teamwork, and trusted team culture, that hold this together at all levels, as the demands are intense. Efficiency and effectiveness depend on coherent, accessible and unburdening information systems. Repetitive capture of data, and incoherence in its forms and applications, are costly. The work of the organization is coalface care, and the information systems require an architecture that supports the coherence and continuity of care, at all levels, where it is most effectively delivered.

  5. Mutual understanding of clinical and engineering domains and need for long-term investment in innovation

There is a need for a workforce skilled and experienced in both the technical and clinical domains of information technology and informatics. The NHS once grew such a cadre of staff, then proceeded to weed it out or demote it to an administrative role, principally devoted to the management of outsourced service contracts. This occurred when unknowledgeable managers perceived this wider home-grown experience and expertise as inessential for the procurement and implementation of the IT systems that were needed, believing it could be left to the suppliers of systems to provide it, as necessary. The service thereby, in effect, outsourced a crucial area of expertise central to its ongoing knowledge and development. The marketplace did not, and could not, grow that breadth of on-the-ground capability. Its staff were focused on selling and installing their own bespoke technology, minimally adjusted to the particular needs of client organizations. That is a pathway towards a market dominated by very undesirable monopoly.

It also serves to constrain innovation within a product-, specialism- and organization-focused ecosystem that is ill adapted to foster and lead radical innovation. Such innovation should draw on and harness the potential of new device technology, information systems and networks to invent new methods of measurement, review and intervention, in support of safe and effective health care services that can now be delivered in, or much nearer to, citizens’ homes, and which can be operated and overseen there by citizens themselves, their carers and their community-based professionals.

We tend to think of these trends in the context of affluent country requirements and their costly health systems. Such radical reinvention of care service delivery would be of equal, if not greater, applicability in developing world contexts, where workforce scarcity and the remoteness of communities from the nearest clinics and hospitals are of a different order. Yet access to low-orbit satellite-mediated broadband at one hundred and twenty megabits per second, backed up by unfailing, battery-stored solar energy, now reaches, continuously, the most remote Aboriginal communities situated many hundreds of kilometres from Alice Springs. Only this morning, I was discussing this reality with that service’s medical director, my openEHR co-founder Sam Heard, in one of our regular weekly chats that brighten both our lives.

  6. Clinical standardization

The clinical importance of the standardization of data has been a rallying call from the start, though the learning required to achieve it has been underestimated. Clinical practice has had a lot to learn about itself in its encounter with the computer. The marrying of the disparate worlds of clinical and technical standardization has been erratic; it has not been a well-conceived, and thus well-owned, process. Efforts towards standardized frameworks for computer-based methods and systems, as integral components of clinical methods and service delivery, have struggled to align within the total health system that supports the maintenance of health and the diagnosis and treatment of disease.

Some of what has been attempted was akin to taking software that implements the TCP/IP standard (Transmission Control Protocol/Internet Protocol), which underpins data network communication, and expecting to use it as a basis for standards defining the meaning and content of the messages themselves. More absurdly, to emphasize the point, it has been somewhat as if the librarian profession were being charged with defining a unified field theory of physics when deciding a basis for cataloguing the physics literature!
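A small illustration of why the analogy matters, with message content invented for the purpose: transport and syntax standards get the bytes across intact, but two well-formed messages about the same observation can still carry quite different meanings. The field names and the code below are hypothetical.

```python
import json

# Two systems both emit well-formed JSON 'about blood pressure' over the same
# network protocols. Both parse perfectly; the syntax layer has done its job.
message_from_system_a = json.dumps({"obs": "BP", "value": "132/84"})
message_from_system_b = json.dumps({"observation": "systolic blood pressure",
                                    "code": "local-code-1234",   # an invented local code
                                    "value_mmHg": 132})

for raw in (message_from_system_a, message_from_system_b):
    parsed = json.loads(raw)   # the 'TCP/IP layer' of the analogy succeeds every time
    print(parsed)              # but equating the two safely needs a shared clinical model
```

Aligning that second layer, the clinical meaning, is work that transport and syntax standards were never designed to do.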

Efforts towards standardization of clinical nomenclatures and knowledge bases sprang from the efforts of academic departments and professional bodies, seeking coherence and discipline in these endeavours. Within well-bounded domains, such as imaging, laboratory services and instrumentation for patient monitoring, interfaces have been sufficiently clear that standardized approaches for data capture and management could be evolved from within those communities of practice, be taken up safely within devices in use, and used to share their data more widely within health care records.

Other than in the domains of medical language and terminology, and in relatively well-defined and encapsulated domains of computerization, such as imaging and laboratory systems, the standardization of clinical data has been conducted predominantly as an exercise in technical standardization, pursued in a mixture of industry, government and inter-governmental bodies, as a consensus-building process more than an experimental one. These processes have lacked recognition of the fundamental message that surfaced in the discussion of formal logic and knowledge bases in Chapter Two: that much of medical knowledge and data, including that relevant to individual patient care, is highly context dependent. Chapter Two rehearsed the defeasible and indefeasible components of knowledge bases. Records of care must capture that relevant context if they are to convey meaning and be capable of reliable grouping and analysis over time. The formalizing of clinical data standards is a huge area of interface with the onward development of information systems, and is closely related to data protection principles and measures, which require to be standardized, too.
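As a minimal, hypothetical illustration of that context dependence (the field names are invented for this sketch), consider three record entries that share the same headline concept. Without the status and basis retained alongside the concept, grouping and analysis over time would silently conflate statements with quite different clinical meanings.

```python
from dataclasses import dataclass

@dataclass
class Statement:
    concept: str   # e.g. 'penicillin allergy'
    status: str    # e.g. 'confirmed', 'refuted', 'family history only'
    basis: str     # e.g. 'observed reaction', 'patient reported'

statements = [
    Statement("penicillin allergy", "confirmed", "observed reaction"),
    Statement("penicillin allergy", "refuted", "negative challenge test"),
    Statement("penicillin allergy", "family history only", "patient reported"),
]

# Grouping on the concept alone would count three 'penicillin allergy' records;
# only with status and basis retained can the statements be grouped and analysed safely.
confirmed = [s for s in statements
             if s.concept == "penicillin allergy" and s.status == "confirmed"]
print(len(confirmed), "confirmed, out of", len(statements), "mentions")
```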

The OTA Report exposed this issue, long ago, and it has reverberated through the decades, at all levels of endeavour, resulting in new datasets, new governance and new law. There is very little continuity of practice in the systems in use, between different institutions and levels of health care. Only when clinicians are enabled to step up, engage and participate as equal partners in system development, with tools that mirror their interests and concerns, can properly experimentally-based ecosystems of standardized clinical data, within standardized technical infrastructure, become a tractable and sustainable proposition.

Lacking the synergy of a common and shared methodology that provides a provenly implementable answer to the ‘how?’ question, the products that arise will continue, predictably, to prove inadequate, inflexible and constraining of choice within the marketplace. The products, in turn, are then unduly costly and burdensome for the tasks they are expected to support. Frustrated efforts towards standardization of data and methods have impeded efforts to move from prototype to product, and to integrate, locally and at scale. By default, standardization of health IT systems sprang from the industry players’ need for their products to communicate digitally with one another. It did not arise from the users’ need for them to communicate clinical meaning between different clinical record-keeping systems. Transparent sharing of the methods employed for representing clinical content within and between systems was seen as a concerning threat to a proprietary product’s commercial viability. The lack of such sharing became an even more impactful threat to its clinical and organizational viability.

  7. Diversity of non-communicating architectures and technology dependence

The truth of this observation, the final concern raised in the OTA Report, has become ever more present and impactful over the intervening years. It has resulted in an unhealthy monopolistic tendency of markets, as purchasers despair of a more flexible and adaptable ecosystem of information systems and commit to a product that ties them and their data tightly to a single supplier. Computer science and computational method have evolved continuously, rendering architecture, design and implementation, and the skills they embody, quickly obsolete. Clinical science, and the computational methods and systems required by health care services, have also evolved in parallel, along with clinical governance and the requirements for the certification of products. Part Three of the book offers a way out of this dilemma that has been shown to be implementable and scalable. It has been achieved with a minuscule fraction of the resource spent elsewhere seeking a solution to the challenge set out in the OTA Report of the 1970s.

  8. Option to allow a free market to develop

The OTA Report feared that an unregulated market would divert attention from innovation and improvement of the process and quality of care towards a focus on the business of supplying systems, to the disbenefit of patients and the protection of their data. It made early suggestions for areas of federal intervention and, all over the world, governments have adopted a middle way, leaving systems architecture within the purview of industry while setting regulatory requirements at national level. The problems that have become more evident over time relate to how to set a generic framework of regulation that can be implemented safely and efficiently across a plethora of architectures. This goes beyond agreement on datasets, into how the data are persisted and processed within systems and communicated from one architecture to another, across the different disciplines and levels of the care system.

  9. Proposal for a national authority to coordinate systems design, common datasets and protection of confidentiality of patient records

The OTA argued for a national body to hold the centre of the stage, and many countries have followed that route. The problem is that the markets for systems are international, and the records of patients need to travel meaningfully between countries. To achieve this, greater discipline and rigour are required, openly and freely available and shared between countries. Standardized methods for handling and communicating the meaning and context of care remain a requirement over and above whatever is decided to meet the needs of any one jurisdiction. This has been a difficult and contentious socio-technical and political challenge and, other than in the areas of terminology highlighted in Chapter Two, rather limited progress has been achieved in relation to whole records.

It should, all the same, be acknowledged that the NHS did, in its early initiatives, follow something of the OTA’s alternative blueprint to a free market. It created an Information Authority, it invested in clinical domain terminology and classification, and it instituted wide-ranging data protection regulation. The problems that arose reflected misunderstanding and miscalculation of the nature and scale of the task, of the environment and leadership required to tackle it, and of the achievability of its ambitions and the resource and time required to realize them, in changing clinical, technical and managerial contexts.

Over and above the prescient OTA appraisal of the unfolding domain, several other observations might now be added:

Corporate engagement:

Large corporations have serially dropped in and out of engagement with the challenges set out in the earliest reviews. Governments, likewise, have looked in detail and then looked elsewhere. Focus on the methods and quality of clinical care, and on the management of services, has lacked synergy of approach with data and record management. The marketplace has often appeared as a Wild West kaleidoscope of money, power, circumstance, technology and obsolescence. Either that, or as an orchestra of untuned instruments and frantic conductors with ineffectually waving arms.

People and environment:

There has been a dearth of good and appropriate multi-disciplinary and multiprofessional environments in which committed teams could learn from and inform one another, to make and sustain progress in designing, implementing and operating sustainable and integrated information systems, working on this from the coalface of care.

Developmental tooling and infrastructure:

Pioneers work with head, hand and heart, and while their head and heart have been able to engage, their hands often have not. Some have built tools and infrastructures with which to make progress, but few of these have survived, scaled and matured as products, infrastructures and services. A much more coherently tooled ecosystem is required to enable active and effective clinical engagement with the domain.

Failure to learn:

Putting all these domains together in the context of a computerized individual patient record, sharable among systems and technologies, as envisaged from the earliest reports, remains contested territory, populated by opposing ideas and ambitions.

Efforts towards computerization frequently expose new questions about the foundations of discipline and practice, as illustrated many times in the preceding chapters. The pursuit of quality improvement in health care practice has become entwined with disparate challenges that need to come together as one: the reform of health care services; an information utility supporting the balance, continuity and governance of care services; and team building to provide the range of skills and competences of health care professionals, engineers and scientists that can be trusted and relied on at the coalface of care.

Legacy:

The result is an underperforming, redundant, unsustainable and ever more costly legacy patchwork of information systems. Surely the time has come to address, more deeply, why this has proved so much more difficult and consequential a problem and ambition for medicine and health care services than for other sciences, professions and sectors of the economy and their supporting industries. I hope this book may be a useful contribution to that important quest. At heart, for me, it is a problem of language, logic and reasoning in the context of the clinical and care domain. The efforts to ‘computerize’ have exposed a mismatch between what science, management and technology can contribute to health care, and the good practice and outcomes that citizens need, and increasingly expect, from their health care professionals and services.

Joseph Weizenbaum placed the fault-line differently, arguing that computer science was a spurious knowledge domain, imputing value to the coding more than to the practical method and content it enabled. That sounds a bit like saying that mathematics is a spurious knowledge domain, and that we should rather value its applications! Pushed from many sides to say where reform and breakthrough in health care practice will come from in the coming decades, I sense, all the same, a heightened feeling of imbalance and unfairness, of rights and responsibilities, and of patterns of inequality in health, tracing back to those identified in Beveridge’s five giants of 1942 and reiterated in the Marmot Reviews of the past decade. Since these have persisted over many decades, notwithstanding the revolutions of computer science and technology, and now genomic science, one must wonder whether AI is destined to help, and whether they ought not to be the greater focus of our attention. There is greater awareness of imbalance, but little sense, still, of how to seek and promote redress. New balance can only be achieved with movement on all sides, supported, I believe, by the reinvention of health information systems as a citizen-focused care information utility–common ground that all stand in need of, and have a role in creating and sustaining, at the heart of the mission for a healthier and better cared-for future Information Society.

Ivan Illich Revisited, Fifty Years On

To complete this long chapter, it is interesting to revisit Illich’s Medical Nemesis,177 to consider how the landscape has changed and adapted to the issues his books identified in the 1970s. Some key developments that appear to line up with the direction of travel he favoured are:

A further trend that supports Illich’s line of argument has been the escalating and increasingly unsupportable cost of hospital medicine, and the burden that the application of advanced technology has placed on health care services. This reflects new methods of acute medicine and of care of the elderly and chronically ill in society, associated with increased human lifespan. The trend is well recognized, and self-care and community-delivered services are identified aims towards its containment. Machine learning has assumed more concrete form. Advances in artificial intelligence, and the novels of Ian McEwan and Kazuo Ishiguro, have brought the prospect of humans living with artificial friends into more plausibly realizable form.178 From the Illich perspective, such innovation would remain doom-laden for humankind.

Illich’s challenge was for society to redefine the disease focus of industrialized medicine into a focus on autonomous individual health care. That virtuous circle could perhaps be squared in a society where healthy lifespan and healthy lifestyle coexist, both locally and globally. No one can be protected from viral pandemic until all are protected. This still looks a long way off. An encouraging and more optimistic vision is that research and discovery can now happen locally and propagate globally, as rapidly as news once travelled in the local village. Within recent decades, the sequencing of DNA has evolved from a billion-dollar, multi-year, multi-laboratory global effort into a single device that achieves much the same ends within hours or minutes. Candidate pharmaceuticals can be rapidly adduced, targeting cell-surface receptors that have been visualized and characterized using quantum theory. Epidemics can be tracked, albeit that the social, political and economic implications of pandemic control remain intractable. And Honeywell’s quantum computer prototypes can now be accessed from the Cloud, bringing the promise of collapsing processing times for complex computational problems from mega-millennia to hours.

This chapter has addressed fifty years of coevolution of health care with information technology and the status quo of today, where early fundamental goals have not yet been achieved. In what way should we re-imagine information for health care and reset our goals, and how should we gain traction in realizing them? That is where Part Three kicks off. To close, here, I reflect on another westward rush for gold!

Parenthesis–Goldrush

As with the ‘alchemy’ of money, vividly characterized as such in the modern age by Mervyn King, the alchemy of information has made and destroyed livelihoods and fortunes.179 It has created and nurtured emergent oligopoly and confused democracy. It has engendered a new goldrush to that same territory out west. Gold was searched for at the end of rainbows, and magic bullets filled the air, missing their targets and exploding in nearby neighbourhoods. Information in the Information Age has assumed ever more strongly political and commercial guises and vestments. Data has been mined for money, power and influence, in basements, backstreets and penthouses, all over the world. It has been an information war zone. There was triumph and disaster, redemption and retribution. There was once a dustbowl created on fertile land. Will bitcoin bite the dust? Is information mirage reverting to data sand? Will quantum circuits blow up a new sandstorm?

This is, no doubt, rather naff hyperbole, but it seeks to dramatize an unwanted future and an urgent need to find new and fertile common ground of information on which to help reinvent the future of health care services. Hype of each era, of whatever kind, on whatever topic, is naturally expressed in articles and histories focused on survivors and their successes. The Gartner consultancy even trades on ‘hype’, characterizing information technologies along a ‘hype cycle’–that is where the money is. In health care, it is money traded within a marketplace of products and services geared to the eyes of investors and purchasing power of organizations. It is a market structure that has led to multiple manifolds of non-coherent health care data about individual citizens, persisted in multiple ecosystems of inconsistent, mutually redundant, proprietary databases, focused on ‘what is in it’ for the investors, companies and organizations concerned. It is a market of costly and inflexible products, lacking an architecture of personal data and record that relates, first and foremost, to ‘what is in it’ for individual citizens in their relationships with multiple organizations of health care, and increasingly in using home-based devices and services that support them in meeting their individual health care needs, including for self-care, and of those they care for.

The reinvention of health care will require reinvention of the architecture and marketplace for health care information systems and services. The drivers for this must be individual citizens, health care professionals, and the provider organizations delivering health care. They alone have the experience, capacity and indeed the right and responsibility, to insist on a different architecture and marketplace, and set a different course.

The saying that those who do not learn from history are destined to repeat its failures is maybe not really true–we tend rather to make new mistakes, in new times, conditioned by new contingencies! We tend to shape our own conclusions from what we want or choose to see in the pattern of past successes and failures. But the impact of our actions can spread with extra force in times of information alchemy and great change, and we should observe and reflect on them carefully. It might also be argued that much detail of the past is redundant, insignificant, and best forgotten. The highly articulate presenters of ‘The Rest Is History’ podcast pondered the topic of whether we can learn lessons from history, in an early episode.180 Amusingly, they quoted a very early historian, Gregory of Tours, writing in the sixth century, who, they say, had the best book opening lines ever, throwing up his hands to express a minimalist overview of history: ‘A great many things keep happening–some of them good, some of them bad!’ That sounds a bit like the opener of A Tale of Two Cities!

Many decision makers, and even many practitioners, have not really experienced, let alone learned from, their failures. Failure can signal a personal negative, but the greater negatives are from organizations that do not learn or do not try to. Perhaps we should not dwell too much on the past, but we surely need to learn better in the present. Each era builds on foundations that it comes to take for granted. Troubles arrive when ground shifts too quickly under foot, foundations subside, and we have forgotten where we came from. Information pandemic is subsidence of a kind, in the foundations of society today. It highlights what we can and cannot currently understand, and safely do. Most experiments are conducted in local contexts and the impacts of failure remain local. It is a characteristic feature of global infrastructure that good and bad things can grow and spread quickly. Software viruses and bugs can turn up almost instantly on machines across the world, and experiments that fail can have wider significance and impact, too.

At a wild guess, perhaps ninety-nine percent, or even more, of the methods used to design and implement information systems over the past five decades are already obsolete. Sadly, a significant proportion of the systems they were used to build are still slogging on, with current work in some way still dependent on them. Knowledge about their design, and the tools used in their development, may no longer be extant. There is a legacy, experienced but largely unseen, of incompatibility and of lost, now obsolete, art lying beneath the surface of software systems still in everyday use. There is a mirage of code, as seen from outside or from afar. What might look easy may have been very hard to achieve; what might look hard may have proved easy. Skilful programmers can find elegant and simple solutions to problems that others find complex and laborious to solve. And AI, too, can now write code!

Society has paid a very high price in creating, sustaining and living in this anarchic landscape. The sunk costs have been huge but have created rather less future value. New costs, both in shoring up the hole into which this legacy has dug us and in building out of it more sustainably for the future, are prospectively also considerable. How well are we learning as we go from this experience, by way of insight about how to do better in the future?

In the West, the locus of invention and creation has moved away from public sectors onto much wealthier, cash-rich commercial landscapes that can operate autonomously, and often with power greater than governments. Only in relatively few environments–like the Harwell science campus, Daresbury, Culham or CERN (Conseil Européen pour la Recherche Nucléaire), or the science campuses from Massachusetts, through Oak Ridge in Tennessee, to California–are comparable capacity and capability brought together and set free to spearhead major innovation today, where science, engineering and practical application can focus and advance in tandem.

Achieving a creative balance of support for public innovation and private industry, tuned to emerging new markets, is a well-recognized concern of our times. The power of global tech-based corporations, in their home and offshore bases, has risen markedly in the age of the Internet. The assumptions and rules governing these industries look to be changing, with anti-trust legislation that regulates monopoly moving beyond law focused on avoidance of consumer detriment towards law based on utility and equity across countries.

The need for standardization, independent of vendor products and suppliers, has long been recognized in key areas of physical infrastructure. It is becoming a more insistent concern for software, too. Incompatible and proprietary products can entrench unhealthy monopoly, by making change away from them intractable, or the cost of doing so too high to be afforded from current budgets. Switching software and systems often entails high cost for semantically safe and accurate migration of historic records, sometimes prohibitively so. Such migration was attempted unsuccessfully in several hospital system procurements I have observed. This matters, increasingly, for coherent lifelong records of care.

We can and must now raise our sights to do much better. Part Three of the book, to which the storyline now moves, is optimistically forward-looking. It moves the focus to who, what, when, where, how and why questions. It is time to put away the ‘retrospectoscope’ and take out, not telescope or ‘predictorscope’, but ‘prospectorscope’ and ‘cocreatorscope’, to seek out and fulfil the Dreaming181 of a future common ground for a coherent, citizen-centred care information utility.


1 Adventures of Ideas (New York: Macmillan, 1933), p. 234.

2 K. Gebreyes, A. Davis, S. Davis and M. Shukla, ‘Breaking the Cost Curve’, Deloitte Insights (9 February 2021), https://www2.deloitte.com/xe/en/insights/industry/health-care/future-health-care-spending.html

3 T. S. Eliot, Little Gidding (London: Faber and Faber, 1943).

4 The history is told in S. Riedel, ‘Edward Jenner and the History of Smallpox and Vaccination’, Proc (Bayl Univ Med Cent), 18.2 (2005), 21–25, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1200696/

5 These issues echo in the Marmot Reviews of recent years, on social inequalities of health, today (M. Marmot, Fair Society, Healthy Lives: The Marmot Review: Strategic Review of Health Inequalities in England Post-2010 (London: Marmot Review, 2010); M. Marmot, ‘Health Equity in England: The Marmot Review 10 Years On’, BMJ, 368 (2020), m693, https://doi.org/10.1136/bmj.m693. Michael Marmot was a senior colleague at UCL when I returned there in 1995). Marmot combined academic aristocracy with sustained research focus on public health. Like many epidemiologists I got to know, he had trained as a doctor but did not pursue a clinical career. He has been an assiduous gatherer and publisher of data, using his passion and organizing skills to create and sustain long-term longitudinal studies and engage widely and internationally in health policy issues of the day.

6 D. Goodhart, Head Hand Heart: The Struggle for Dignity and Status in the 21st Century (London: Penguin Books, 2020).

7 This was in the years before H2 receptor antagonists, proton pump inhibitors and H. pylori, the science of which was developing in tandem collaborations between academic pharmacology departments and industry, led by James Black (1924–2010). Black was a colleague at University College London (UCL) in a triumvirate of supremos, with John Vane (1927–2004) and Salvador Moncada. They shared common working links with the Wellcome Foundation pharmaceutical company and London universities, laying scientific foundations for the future global pharmaceutical industry. Black is remembered for the invention of the drugs cimetidine and propranolol and was awarded the 1988 Nobel Prize in Physiology or Medicine. Vane led the unravelling of the mechanisms of aspirin and of angiotensin-converting enzyme inhibitors. He was a co-recipient of the 1982 Nobel Prize in Physiology or Medicine for the discovery of prostaglandins. Moncada worked with Vane at the Royal College of Surgeons, then at the Wellcome Foundation, where he played a seminal role in unfolding the biological function and metabolism of nitric oxide. Many puzzled as to why he was not recognized for this in the 1998 Nobel Prize in Physiology or Medicine, which celebrated that field of discovery. After leaving the Wellcome Foundation, Vane came to establish the William Harvey Research Institute at St Bartholomew’s Hospital (Bart’s), in the building adjacent to my office at the time. I worked with Moncada when he came to UCL in the mid-1990s, as founding director of the Wolfson Institute for Biomedical Research, where he also led the work to draw together and coordinate UCL’s then rapidly growing range of specialist biomedical research institutes. We conferred on the development of a common strategy and team for their IT support services, which I was leading for the Biomedicine Executive Group.

8 I. Illich, Limits to Medicine: Medical Nemesis: The Expropriation of Health (London: Boyars, 1995), p. 109.

9 Disproportionately, that meant the capital city, London—but therein lies another story, persistent to this day, about the connection of medicine with politics!

10 P. Pullman, His Dark Materials Trilogy (London: Scholastic, 1997).

11 D. Ingram et al., ‘An Interactive Videodisc “Cancer Patients and Their Families at Home”, Designed for Education in Primary Health Care’, Journal of Audiovisual Media in Medicine, 15.2 (1992), 73–76.

12 E. Burke, The Works of the Right Honorable Edmund Burke, vol. I (New York: Little, Brown, 1877), p. 437.

13 A. Gawande, Better: A Surgeon’s Notes on Performance (New York: Metropolitan Books, 2007).

14 M. C. Escher, ‘Print Gallery’, Digital Commonwealth, https://ark.digitalcommonwealth.org/ark:/50959/3r076s71b

15 E. F. Schumacher, Small Is Beautiful: A Study of Economics as if People Mattered (London: Abacus, 1973).

16 M. Arnold, Culture and Anarchy: An Essay in Political and Social Criticism (Cambridge, UK: Cambridge University Press, 1869).

17 C. P. Snow, The Two Cultures and the Scientific Revolution (Cambridge, UK: Cambridge University Press, 1959).

18 R. E. Susskind and D. Susskind, The Future of the Professions: How Technology Will Transform the Work of Human Experts (Oxford: Oxford University Press, 2015).

19 K. Popper, The Open Society and Its Enemies: The Spell of Plato (London: Routledge and Kegan Paul, 1957).

20 T. L. Friedman, The World Is Flat: A Brief History of the Twenty-First Century (New York: Picador/Farrar, Straus and Giroux, 2007); F. Fukuyama, The End of History and the Last Man (London: H. Hamilton, 1992).

21 C. Darwin, On the Origin of Species by Means of Natural Selection, or, The Preservation of Favoured Races in the Struggle for Life (London: John Murray, 1860), p. 58.

22 D. A. Sinclair, Lifespan: Why We Age–and Why We Don’t Have To (London: Harper Collins, 2019).

23 To gain an overview of this scene that has played out over many decades, I trawled through multiple sources to construct a timeline of the changes in UK legislation for health care services since 1948 and the creation, abolition and reinvention of organizational structures that were established to run, audit, regulate and promote change in its operations. I charted: eight Acts of Parliament, sixteen major reorganizations, six quality and service improvement agency initiatives, six overarching governance bodies and nine IT restructurings (see Appendix II of the additional resources, available at https://www.openbookpublishers.com/books/10.11647/obp.0335#resources). I also mapped the key official summary descriptions of these. I then logged, alongside, my experience of the forty years of what was called information management—the ways in which the many policy documents of those decades were translated into action. I have used some of this narrative to give context to the discussion of the policy documents themselves, later in this chapter. This provides an illuminating context for Part Three of the book, which is about creating the future. The history and battlefield are not pretty, and it is easy to criticize. The actors were placed among many rocks and hard places. They were all the time working upwards towards ministers and downwards towards the front line of services. Policy and investment were, by and large, framed in terms of management perspective and action at scale, from the top down. The need addressed would have been better framed and approached with actions centred more locally, within the professions and from the bottom up. Significant innovation, experience, and learning in the history of the field came from there. In Chapter Eight, at the start of Part Three of the book, I describe some of the great local innovators with global perspective that I have known. It is to such people, teams, and environments, and their stories, that we should look for resolution of recurrent crisis in the field and help their achievements to emerge and grow. In Appendix III of the additional resources, I describe more of my personal experience on the ground through those six decades. It is a subjective and partial perspective and others will have seen things quite differently. I place it in the additional resources as it is not central to my purpose in the book, of helping to illuminate and chart a way forward—I do not wish to stir tired or sleeping dragons, as that will not help!

24 DEC had its origins at the Massachusetts Institute of Technology (MIT) in the late 1950s and sold its first PDP-1 computer in 1960, for one hundred and twenty thousand dollars, produced in an old wool mill in Maynard, Massachusetts! It became an industrial titan of the era, and in the end it fell mightily. DEC started by concentrating on computers as modular components of laboratory and industrial equipment. The company rose through the 1960s to rule the minicomputer world for two decades, becoming similar in size to IBM. Mainframe technology grew ever hotter in its air-conditioned machine rooms. DEC failed to metamorphose to match the business model of the emerging microcomputer world and was outsmarted by rival IBM with its IBM PC. DEC’s final demise was to be taken over by Compaq, in 1998, and Compaq by Hewlett-Packard in 2002. This corporation subsequently acquired Autonomy, the Cambridge-founded company that accelerated the industry into machine intelligence based on analysis of unstructured data, and later became embattled with Autonomy’s founder.

25 John was later appointed to run the Imperial College Medical School, working with Chris Edwards, a consultant endocrinologist and researcher at Bart’s in my early years there. Chris was at the time Head of that Medical School and was subsequently Vice-Chancellor of Newcastle University and a Wellcome Trust Trustee. Years later, he invited me to join an interesting initiative, Planet Earth, that he was leading, focused on improving utilities of health, shelter, water and environment for Africa. There, I met amazing innovators like Magdi Yacoub, the cardiac surgeon, busy supporting a new research institute in his native Egypt. The Planet Earth initiative sadly collapsed, due to insurmountable problems with its operations.

26 D. Black, ‘Data for Management: The Körner Report’, BMJ (Clin Res Ed), 285 (1982), 1227–28, https://doi.org/10.1136/bmj.285.6350.1227

27 L. A. Pray, ‘Discovery of DNA Structure and Function: Watson and Crick’, Nature Education, 1.1 (2008), 100, https://www.nature.com/scitable/topicpage/discovery-of-dna-structure-and-function-watson-397/

28 J. Weizenbaum, Computer Power and Human Reason: From Judgment to Calculation (Harmondsworth: Penguin Books, 1993).

29 Ibid., p. 18.

30 Ibid., p. 206.

31 Ibid., p. 207.

32 Ibid., p. 255.

33 Ibid., p. 259.

34 Ibid., p. 265.

35 Ibid., p. 271.

36 I. Illich, Deschooling Society (London: Calder & Boyars, 1971); I. Illich, Limits to Medicine: Medical Nemesis: The Expropriation of Health (London: Boyars, 1995).

37 Illich, Limits to Medicine: Medical Nemesis, p. 11.

38 Ibid., p. 166.

39 Ibid., p. 116.

40 Ibid., p. 161.

41 Ibid., pp. 166–67.

42 Ibid., p. 162.

43 Ibid., p. 163.

44 J. Monod, Chance and Necessity: An Essay on the Natural Philosophy of Modern Biology (New York: Knopf, 1971).

45 A. N. Whitehead, The Aims of Education and Other Essays (New York: Macmillan, 1929), p. 4.

46 A. N. Whitehead, ‘Universities and their Function’, Bulletin of the American Association of University Professors (1915–1955), 14.6 (1928), 448–50 (p. 450).

47 J. Anderson and A. Graham, ‘A Problem in Medical Education: Is There an Information Overload?’, Medical Education, 14.1 (1980), 4–7, https://doi.org/10.1111/j.1365-2923.1980.tb02604.x

48 Assessments of the quality of craft and profession are inevitably quite subjective. I remember vividly an art class exam in secondary school where we were asked to paint a gloomy, wintry riverside scene, with a wood of trees dangling roots into murky water. I ended with a picture I still have that was, in my eyes, a blurry, wet disaster. Coming to class, feeling trepidation about having the marked work given back, I was shocked to see ninety-eight percent written at the top and hear the genial art teacher hold it up and expound to the class its artistic merits! Art is about conveying meaning and feeling as well as technical mastery, and that involves impact on other people as much as oneself. I still look at the picture and think it was terrible!

49 ‘Report of the Public Inquiry into Children’s Heart Surgery at the Bristol Royal Infirmary’, The Health Foundation (18 July 2001), https://navigator.health.org.uk/theme/report-public-inquiry-childrens-heart-surgery-bristol-royal-infirmary

50 C. Vincent, ed., Clinical Risk Management: Enhancing Patient Safety, 2nd ed. (London: BMJ, 2001).

51 Congress of the United States Office of Technology Assessment, Computer Technology in Medical Education and Assessment (Washington, DC: Congress of the United States Office of Technology Assessment, 1979), https://www.princeton.edu/~ota/disk3/1979/7903/7903.PDF, p. 5.

52 Ibid., pp. 5–6.

53 My involvement in creating and running research computing environments and infrastructures extended over many disciplines. I was for several years given the responsibility to lead and coordinate the IT professionals working in separate computer support teams on the three main medical school campuses and the separate biomedicine research institutes of UCL in London. These comprise some fifty percent of the volume of academic work of UCL, as is typical of world-ranking Universities. I also chaired the university IT infrastructure committee and was a member of its information strategy and finance committees, and of the biomedical executive committee of the University. Further afield, I was at different times a member of boards overseeing research and library computing for the MRC, EPSRC, EESRC, CCLRC and for the British Library and Wellcome Trust.

54 This initiative was mentioned in Chapter Five, in the context of a meeting to discuss a consultant’s report on the anarchic disorganization of some government databases, that Al had asked me to attend with him, in his later capacity of national Children’s Commissioner.

55 Under its then new leader, Robert Naylor, the UCLH Trust recruited Paul Bate from Chris Ham’s department of health care management in Birmingham, who specialized in organizational change and came to run the Organizational Development programme of the UCLH Trust. This Trust was itself engaged in pulling together several other local NHS Trusts under the one UCLH umbrella. Paul and his colleague, Glenn Roberts, joined our department and conducted several research projects across the NHS and internationally, including one studying the organizational impact of IT innovations. I arranged for Paul to teach what became a very much appreciated module on this topic for our very successful health informatics graduate programme, led at that time by Paul Taylor.

56 W. C. Sellar and R. J. Yeatman, 1066 and All That. A Memorable History of England Comprising, All the Parts You Can Remember Including One Hundred and One Good Things, Five Bad Kings, and Two Genuine Dates (London: Methuen, 1930), p. 116.

57 H. W. J. Rittel and M. M. Webber, ‘Dilemmas in a General Theory of Planning’, Policy Sciences, 4.2 (1973), 155–69, https://www.jstor.org/stable/4531523

58 M. C. Escher, ‘Waterfall’, Digital Commonwealth, https://ark.digitalcommonwealth.org/ark:/50959/3r076s93c

59 E. Schrödinger, What Is Life? (Cambridge, UK: University Press, 1948), p. 76.

60 D. Shirres, ‘Digital Delusion: A Lesson from Not-so-long Ago’, Rail Engineer (3 September 2018), https://www.railengineer.co.uk/digital-delusion-a-lesson-from-not-so-long-ago

61 T. Justinia, ‘The UK’s National Programme for IT: Why Was It Dismantled?’, Health Services Management Research, 30.1 (2017), 2–9.

62 In What Do You Care What Other People Think? (New York: Bantam, 1989), Richard Feynman (1918–88) unravelled the sequence of decision making that led to the inappropriate rocket seals that caused the explosion on take-off.

63 W. Raleigh, The Works of Sir Walter Raleigh (New York: Burt Franklin, 1966), p. lxiii.

64 Raleigh was an on-off favourite of Elizabeth I and, subsequently, an inconvenient bête noire of James I.

65 This section of the book was written in what seems, as I finalize it, a rather outspokenly critical and direct manner. It perhaps reflects a time when a highly emotional personal struggle with mitigation of the lack of connected and accessible health care records of my wife, in a critical care context, was still deeply infused in my mind. I have chosen to leave it expressed in this vein as it is authentic of how the failure of information policy to achieve coherently connected care records can deeply affect us all. It is offered with constructive intent and without recrimination or rancour.

66 R. Smith, ‘The Future of Healthcare Systems’, BMJ, 24.314 (1997), 1495–96, https://doi.org/10.1136/bmj.314.7093

67 P. Tillich, The Shaking of the Foundations (New York: Charles Scribner’s Sons, 1953).

68 The Economist, 28 February 1998.

69 F. M. Wilson, In the Margins of Chaos: Recollections of Relief Work in and between Three Wars (New York: Macmillan, 1945).

70 S. Kliff, ‘Obama’s Surprising Answer on Which Part of Obamacare Has Disappointed Him the Most’, Vox (9 January 2017), https://www.vox.com/2017/1/9/14211778/obama-electronic-medical-records; S. Pipes, ‘Electronic Health Records Are Broken’, Forbes (28 May 2019), https://www.forbes.com/sites/sallypipes/2019/05/28/electronic-health-records-are-broken/?sh=14ad868d546a

71 I am quoting here from paper documents in my archive, some of which have now proved impossible to trace online. These thus lack accessible source citations but are nevertheless important to include as illustrative of the fifty-year timeline.

73 Nuffield Provincial Hospitals Trust, The Flow of Medical Information in Hospitals (London: Oxford University Press, 1967).

74 Ibid., p. 5.

75 Congress of the United States Office of Technology Assessment, Policy Implications of Medical Information Systems (Washington, DC: Congress of the United States Office of Technology Assessment, 1977), https://www.princeton.edu/~ota/disk3/1977/7708/7708.PDF

76 Congress of the United States Office of Technology Assessment, Coastal Effects of Offshore Energy Systems (Washington, DC: Congress of the United States Office of Technology Assessment, 1976), p. 4.

77 Congress of US OTA, Policy Implications, p. v.

78 Ibid., p. 4.

79 Ibid., p. 4.

80 Ibid., p. 5.

81 Ibid., p. 7.

82 Ibid.

83 Ibid., p. 12.

84 Ibid., p. 14.

85 Ibid., p. 12.

86 Ibid., p. 14.

87 Ibid., p. 12.

88 Ibid., p. 6.

89 J. Walker, E. Pan, D. Johnston, J. Adler-Milstein, D. W. Bates and B. Middleton, ‘The Value of Health Care Information Exchange and Interoperability: There Is a Business Case to Be Made for Spending Money on a Fully Standardized Nationwide System’, Health Affairs, 24.Suppl1 (2005), W5-10–W5-18, https://doi.org/10.1377/hlthaff.W5.10

90 M. Fairey, A National Strategic Framework for Information Management in Hospital and Community Health Services (London: DHSS, 1986), foreword (n.p.).

91 NHS Information Management Centre, Introduction to the Common Basic Specification (Birmingham, UK: NHS Information Management Centre, 1990), p. 1.

92 Secretary of State for Health, The Health of the Nation: A Consultative Document for Health in England (London: HMSO, 1991).

93 Ibid., p. 7.

94 General Medical Council, Tomorrow’s Doctors: Recommendations on Undergraduate Medical Education (London: GMC, 1993).

95 A. Wyke, ‘Peering into 2010: A Survey of the Future of Medicine’, The Economist (19 March 1994), 1–20.

96 Ibid., p. 2.

97 It is hard not to descend somewhat into ranting in appraising this situation in retrospect, here. Given the extent of clear failure to achieve goal after goal, plan after plan, billion after billion of resource, decade after decade of effort, litigation after litigation of medical error, the ‘how’ of digitizing health and care records has been an Emperor’s clothes disaster of confabulation, collusion, justification and shame! Rather than blushing in the face of clear evidence, and attending to the repair of modesty, Emperors of today continue with a mixture of hubris, denial and obfuscation. Why has all this failed, repetitively, for so long? What have we learned from this failure? How should that learning reflect in how we proceed from here? I have, for thirty years, since 1990, helped build and lead an international community, across health care, academia and industry, which has devised, worked on, implemented, demonstrated and made openly and freely accessible, in the Creative Commons, a radically different methodology for tackling the how question. This is the subject of Chapters Eight and Eight and a Half. End of rant! No judgemental or malign intent towards anyone, but this has been hugely consequential systemic failure.

98 Little and Large were a comedy double act some years ago—one rather little and one rather large. Little was the sharper brain of the duo and Large the brawnier—there are echoes of little and large (Big!) data, here!

99 Wyke, ‘Peering into 2010’, p. 17.

100 Ibid., p. 18.

101 Ibid., p. 18.

102 Ibid., p. 18.

103 Audit Commission, Setting the Records Straight: A Study of Hospital Medical Records (London: HMSO, 1995).

104 NHS Executive, Seeing the Wood, Sparing the Trees: Efficiency Scrutiny into the Burdens of Paperwork in NHS Trusts and Health Authorities (London: HMSO, 1996).

105 Ibid., para. 223.

106 Smith, ‘Future of Healthcare Systems’, 1495–96.

107 As an aside, I see that on p. 1559 of the same issue was a personal column by Trish Greenhalgh, a kindred radical spirit of the Editor, reflecting on corporate speak about health care services and their management, and on the rather simpler and more important things that matter to patients—being listened to, having problems sorted out and receiving continuity of care. I got to know her when she brought her team onto the same floor as CHIME, at UCL; we respected and valued her courage and energy. With self-effacing humour, she touted the new image accompanying the piece—new hairstyle and airbrushed detail to distract and conceal anno domini, she said! She used this as a clever and, as ever, wickedly rebellious allusion to the corporate speak she observed at the time, which she described as buffing a self-image of general practice that was unrecognizable in the experience of many practitioners, who are, nonetheless, she observed, rather good at meeting the simple basic needs of their patients.

108 Department of Health, The New NHS: Modern, Dependable (London: The Stationery Office, 1997); Department of Health, Our Healthier Nation: A Contract for Health (London: The Stationery Office, 1998).

109 Department of Health, A First Class Service: Quality in the New NHS (London: The Stationery Office, 1998).

110 F. Burns, Information for Health, An Information Strategy for the Modern NHS 1998–2005 (London: NHS Executive, 1998).

111 Ibid., p. 9.

112 Ibid., p. 15.

113 Ibid., p. 25.

114 Ibid.

115 Ibid.

116 Ibid., p. 26.

117 Ibid., p. 27.

118 Ibid., p. 28.

119 Ibid., p. 29.

120 Ibid., p. 32.

121 Ibid., p. 36.

122 Ibid., p. 38.

123 Ibid., p. 39.

124 Ibid., pp. 46–47.

125 Ibid., p. 48.

126 I met Alan Duncan McNeil, who had headed the NHS NPfIT Design Authority, some years afterwards, when he was running the IHTSDO and Martin Severs and I were seeking to align openEHR and SNOMED, conceptually and organizationally. He was impressed and in favour of this, I understood, but the quest succumbed in an international cauldron of ‘argy-bargy’, like that which boiled the waters of IfH/NPfIT/CfH care record integration, nationally. The latter quest has continued to bubble in subsequent NHS-X, and now NHS England, waters. The structures have changed again, twice, during the writing of this book! And the new industry saviour in favour looks to be Palantir. Zobaczymy [we will see]!

127 F. P. Brooks Jr., The Mythical Man-Month: Essays on Software Engineering (New Delhi: Pearson Education, 1995).

128 D. Wanless, Securing Our Future Health: Taking a Long-Term View (London: HM Treasury, 2002).

129 Ibid., p. 4.

130 Ibid., p. 67.

131 Ibid., p. 14.

132 Ibid., p. 15.

133 Ibid., p. 16.

134 Ibid., p. 18.

135 Ibid., p. 19.

136 Ibid., p. 21.

137 Ibid., section 2.35.

138 Ibid., p. 50.

139 Ibid., p. 53.

140 Ibid., pp. 55–56.

141 Ibid., p. 56.

142 The Office for National Statistics reported total current health care expenditure in 2020 as two hundred and sixty-nine billion pounds, representing 12.8 percent of GDP. J. Cooper, ‘Healthcare Expenditure, UK Health Accounts Provisional Estimates: 2020’, ONS (1 June 2021), https://www.ons.gov.uk/peoplepopulationandcommunity/healthandsocialcare/healthcaresystem/bulletins/healthcareexpenditureukhealthaccountsprovisionalestimates/2020

143 Wanless, Securing Our Future, p. 78.

144 Ibid., p. 92.

145 Ibid., p. 93.

146 Ibid., p. 97.

147 Ibid., p. 98.

148 Ibid., p. 101.

149 Ibid., p. 102.

150 Ibid., pp. 105–08.

151 Ibid., p. 110.

152 Ibid., p. 115.

153 Ibid., p. 117.

154 Ibid., pp. 123, 127–28.

155 Ibid., pp. 145, 148.

156 Department of Health, Delivering 21st Century IT Support for the NHS: National Specification for Integrated Care Records Service (London: Department of Health, 2002).

157 S. Leatherman and K. Sutherland, The Quest for Quality in the NHS: A Mid-term Evaluation of the Ten-year Quality Agenda (London: The Stationery Office, 2003).

158 Ibid., p. 44.

159 Ibid., p. 267.

160 Institute of Medicine, Crossing the Quality Chasm: A New Health System for the 21st Century (Washington, DC: The National Academies Press, 2001); Institute of Medicine, Computer-Based Patient Record: An Essential Technology for Health Care (Washington, DC: The National Academies Press, 1991).

161 D. Protti, World View Reports (London: NHS CFH Press, 2005).

162 V. Stroetmann, J.-P. Thierry, K. Stroetmann and A. Dobrev, eHealth for Safety: Impact of ICT on Patient Safety and Risk Management (Luxembourg: Office for Official Publications of the European Communities, 2007).

163 R. Wachter, Making IT Work: Harnessing the Power of Health Information Technology to Improve Care in England (London: Department of Health, 2016), p. 58.

164 Ibid., p. 6.

165 Ibid.

166 Black, ‘Data for Management’, 1227–28.

167 E. Topol, The Topol Review: Preparing the Healthcare Workforce to Deliver the Digital Future (London: National Health Service, 2019), p. 12, https://topol.hee.nhs.uk/wp-content/uploads/HEE-Topol-Review-2019.pdf

168 Ibid., pp. 3–5.

169 Ibid., p. 54.

170 M. Marmot, Fair Society, Healthy Lives: The Marmot Review: Strategic Review of Health Inequalities in England Post-2010 (London: Marmot Review, 2010); M. Marmot, ‘Health Equity in England: The Marmot Review 10 Years On’, BMJ, 368 (2020), m693, https://doi.org/10.1136/bmj.m693

171 Horace (65 BCE–8 BCE), Ars Poetica, l. 138.

172 I. Stewart, Life’s Other Secret: The New Mathematics of the Living World (New York: John Wiley and Sons, 1998).

173 J. A. Wheeler, ‘Information, Physics, Quantum: The Search for Links’, in Feynman and Computation, ed. by A. Hey (Boca Raton, FL: CRC Press, 2018), pp. 309–36, https://doi.org/10.1201/9780429500459-19

174 Congress of US OTA, Policy Implications, pp. 4, 5, 7, 12, 14.

175 Ibid., p. 6.

176 Ibid., p. 7.

177 I. Illich, Limits to Medicine: Medical Nemesis: The Expropriation of Health (London: Boyars, 1995).

178 I. McEwan, Machines Like Me (Toronto: Knopf Canada, 2019); K. Ishiguro, Klara and the Sun (New York: Knopf, 2021).

179 M. King, The End of Alchemy: Money, Banking and the Future of the Global Economy (New York: W. W. Norton and Company, 2016).

180 D. Sandbrook and T. Holland, The Rest Is History (2020–), https://www.goalhangerpodcasts.com/lineker-and-baker-copy

181 On the Aboriginal concept of the Dreaming, see Preface.
