Audience Testing of the Intu version of the mixed reality experiences at the Natural History Museum

Factory 42’s original proposal to the Audiences of the Future programme envisaged that the mixed reality experiences designed for the Natural History Museum and the Science Museum should also be available in shopping centres. This proposal raised fascinating questions during the research and development phase of the project about the design of the experience in a non-museum space, both in terms of its content and structure, and in terms of its layout and learning outcomes. Detailed attention was paid to Intu’s research into their own audiences, including how e-commerce, computer vision and AI are changing shopping experiences, and this, alongside recommendations by the two museums’ audience research teams, determined decisions about the length of the experience, its content, structure, format, and pricing.

Intu is the largest shopping centre network in the UK, with over 22 million sq ft of retail and leisure space. The site put forward for the experience, the Intu Metrocentre in Dunston near Newcastle, is one of the largest shopping destinations in the UK, with more than 370 shops making it the second largest shopping centre in the country. The shopping centre has an average dwell time of one hour, and it is anticipated that the provision of an exciting mixed reality experience might prolong that, while also offering a flavour of the longer and more complex London versions. Interestingly, while the setting up of museum experiences in shopping centres and other non-museum spaces, such as airports, is now fairly common practice, especially in the USA and in Europe, they rarely involve interactive and immersive technologies or performative events.

It was decided that at the Intu shopping centre the two museum experiences, focussing on dinosaurs and robots respectively, would be built side by side, largely to maximise marketing and visitor engagement within the constraints of this particular environment. Although the experience was designed separately from the two museum experiences, and was meant to be a shorter version of them, it was considered crucial that it align with both museums’ brands and that the content be curated with a learning perspective in mind.

Introductory signage at Natural History Museum

The design of the experience was finalised in a theme book in late November 2019, and early testing provided crucial insights for the other versions. Usability testing was completed in January 2020, and the experience was tested on a larger scale at the Natural History Museum Jerwood Gallery in February 2020, before the early March launch at the Metrocentre. The testing was done behind closed doors, but many members of the general public became intrigued by the sign placed in front of the gallery and hovered outside, trying to take a peek into the gallery and find out what the experience consisted of and when it would open to them.
The experience takes place within a black-box type of set that will eventually be positioned inside the shopping centre. The first section of the set was built to host the on-boarding for each of the experiences, with staff helping audiences to put on the headsets and giving them basic instructions about the experience.
This part of a mixed reality experience has been described by Steve Benford and other staff from the Mixed Reality Lab at the University of Nottingham as a form of orchestration aimed at, quite literally, channelling people down a particular path or trajectory, the real challenge being, as Martin Flintham noted, ‘the art of guiding or shaping an experience as it unfolds’ (in Benford and Giannachi 2011: 209, added emphasis). Orchestration is often pre-designed or pre-scripted, but it must also continuously evolve, since participants in a mixed reality work, usually because of the complexity of the overlay of physical and digital components, often depart from any preferred trajectory of the experience. For Flintham, orchestration can in fact be divided into different phases, including:
  • Monitoring, which involves observing what participants are doing, and anticipating what they might do next, so as to alert both behind-the-scenes crews and participants to possible trouble;
  • Intervention, which involves intervening into an experience to respond to problems as they occur and potentially bring participants back onto a preferred trajectory within the engagement;
  • Improvisation, which involves adding personalised detail to a participant’s experience;
  • Induction, or on-boarding, which involves leading participants into the experience, for example by fitting them with a technology, and ensuring that they understand the rules of the game or experience;
  • Distributed orchestration, which involves sharing observations with other members of the team, who may have noticed different aspects of participants’ behaviour (in Benford and Giannachi 2011: 210; see also Crabtree et al. 2004; Schnädelbach et al. 2008).

Companies like Blast Theory have made extensive use of orchestrators, and museums whose exhibitions have a technology focus have also often used staff in this capacity. The San Francisco Exploratorium, famously, makes use of what it calls explainers: trained and experienced educators who lead demonstrations and are particularly valuable in helping key target audiences step into each of the stories at the heart of its interactive installations and experiences.

Member of the public filling in feedback evaluation form

The testing of the work produced some amazing feedback about the Intu experience and the use of the Magic Leap headset, and staff at Factory 42, the Almeida, the two museums, and Magic Leap are now busy using that feedback to design the final iteration of the work for the launch at the Intu shopping centre. What is fascinating in this era of increased automation is that it is a human dimension, one developed through the relationship between the orchestrators and the audience, that revealed itself in the testing to be the most exciting area to explore in the next development of the project.


  • Benford, S., and Giannachi, G. (2011) Performing Mixed Reality, Cambridge, Mass.: The MIT Press.
  • Crabtree, A., Benford, S., Rodden, T., Greenhalgh, C., Flintham, M., Anastasi, R., Drozd, A., Adams, M., Row Farr, J., Tandavanitj, N., and Steed, A. (2004) ‘Orchestrating a Mixed Reality Game “On the Ground”’, in Proceedings of the 2004 CHI Conference on Human Factors in Computing Systems, 391–398, Vienna: ACM Press.
  • Intu, verified 16/2/2020.
  • Oppenheimer, F. (2012) ‘Origin of the Explainer Programme’, verified 18/2/2020.
  • Schnädelbach, H., Egglestone, S. R., Reeves, S., Benford, S., Walker, B., and Wright, M. (2008) ‘Performing Thrill: Designing Telemetry Systems and Spectator Interfaces for Amusement Rides’, in CHI 2008: ACM Proceedings, 1167–1176, Florence: ACM.
