Photogrammetry

Speaking at Heritage for the Future: East Midlands

On May 23rd, Bethany will be speaking at the Heritage for the Future: East Midlands Exhibition. She will discuss how digital technologies are helping to bring heritage into the future and how integrating technologies like laser scanning, photogrammetry, game engines and XR can help modernise the heritage experience, reveal information lost to time and engage broader audiences.

Heritage Lincolnshire, in partnership with Heritage Trust Network, is putting on the event “to encourage local businesses, community groups, projects and individuals within the heritage industry to find out more about saving their local heritage and how they can support local community groups and organisations.”

If you are interested in attending, tickets can be found on Eventbrite.

 

Case study combining laser scanning + RTI: Oswald Allen Memorial

Deciphering the worn inscription on the Oswald Allen memorial was an interesting challenge in heritage investigation. This Victorian ‘medicine pot’ memorial was believed to pay tribute to Oswald Allen, founder of the York Dispensary, and his wife Frances. However, despite a Victorian record of the stones within the St Lawrence churchyard, no record of Allen’s memorial inscription can be found, and its message is therefore at risk of being lost to history forever. For this reason, a parishioner of St Lawrence Church approached us to see whether digital technologies could decipher anything beyond what is visible to the naked eye.

Determined to at least partially decipher the inscription, we tried several different methods. First, we ran a photogrammetry test to see whether this technique would capture enough fine detail from the stone’s surface to reveal subtle depth information. The test did not look promising without bringing in further techniques such as high-resolution macro photography or photometric stereo. Our second method, therefore, was laser scanning, to see whether its level of detail could resolve small changes in depth.

We were provided assistance with the laser scanning by XR Stories and by SIGN’s Creativity Lab, both very useful resources available to small businesses in the screen and interactive media sector in the Yorkshire region. With the stone scanned, we first had to flatten the model so that depth information could be used to help interpret the inscription. To do this, we used software called CloudCompare. Once the model was flattened, we applied a colour scale that changes with height. This helped us see a little more, but the first scans did not capture enough fine detail to reveal much.
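The flattening step can be sketched in a few lines. CloudCompare does this interactively, but the core idea is simply to fit a plane to the cloud and colour each point by its signed distance from that plane, so that shallow carving stands out. A toy NumPy sketch (the points below are made up, not the actual scan):

```python
import numpy as np

def depth_from_plane(points):
    """Fit a least-squares plane to a point cloud (via SVD) and return
    each point's signed distance from it. These 'depths' are what a
    height-based colour ramp is applied to when hunting for shallow carving."""
    centred = points - points.mean(axis=0)
    # The right singular vector with the smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    normal = vt[-1]
    return centred @ normal

def depth_to_colour(depths):
    """Map depths onto a simple blue-to-red ramp (RGB in 0..1)."""
    t = (depths - depths.min()) / (np.ptp(depths) + 1e-12)
    return np.stack([t, np.zeros_like(t), 1.0 - t], axis=1)

# Toy 'stone face': four flat corners plus one recessed point
# standing in for a carved letter.
pts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0], [0.5, 0.5, -0.1]])
depths = depth_from_plane(pts)
colours = depth_to_colour(depths)
```

The recessed point ends up with the largest absolute depth, so it gets the most distinct colour on the ramp, which is exactly the effect that made the faint lettering easier to spot.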

Next, we took on a technique new to us: Reflectance Transformation Imaging (RTI). Unlike photogrammetry, where the camera is moved around an object, in RTI the camera stays fixed and it is the flash that moves. After a series of images is taken with the flash positioned as shown in the diagram below, processing software combines them to produce further depth information in the form of a normal map. A normal map encodes the surface orientation of a 3D model and can be used to give a flat model surface the appearance of relief. The RTI viewer software also allows you to move a virtual light source around the object, using shadows to highlight depth information.

A normal map produced through RTI

Placement of light/flash during RTI.
Image credit: https://cceh.github.io/rti/intro.html
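Under a matte-surface (Lambertian) assumption, the normal-map step is classic photometric stereo: with the flash positions known, each pixel's intensities across the image stack form an overdetermined linear system for that pixel's surface normal. Dedicated RTI software fits richer models than this, but a minimal NumPy sketch on synthetic data shows the idea:

```python
import numpy as np

def normals_from_rti(images, light_dirs):
    """Recover a per-pixel normal map from an RTI-style image stack.

    images     : (k, h, w) grayscale intensities, one image per flash position
    light_dirs : (k, 3) unit vectors from the surface toward each flash

    Assumes a matte (Lambertian) surface, so intensity = albedo * dot(n, l).
    Solving light_dirs @ g = intensities per pixel in the least-squares sense
    gives g = albedo * n; normalising g yields the normals."""
    k, h, w = images.shape
    g, *_ = np.linalg.lstsq(light_dirs, images.reshape(k, -1), rcond=None)
    albedo = np.linalg.norm(g, axis=0)
    normals = (g / (albedo + 1e-12)).T.reshape(h, w, 3)
    return normals, albedo.reshape(h, w)

# Synthetic check: a flat patch facing straight up, lit from four flash positions.
L = np.array([[1, 0, 1], [-1, 0, 1], [0, 1, 1], [0, -1, 1]], dtype=float)
L /= np.linalg.norm(L, axis=1, keepdims=True)
true_n = np.array([0.0, 0.0, 1.0])
stack = (L @ true_n)[:, None, None] * np.ones((4, 2, 2))  # one intensity per light
normals, albedo = normals_from_rti(stack, L)
```

With a real capture, `images` would be the photographs from the dome of flash positions in the diagram, and the recovered normals are what the viewer relights interactively.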

We then combined the two techniques by performing RTI on the laser-scanned model within the 3D modeling software.

Using these techniques, as well as good old-fashioned raking light (manually using light over the surface of the object to induce shadows which help read inscriptions) and research, we were able to provide more of the inscription than was previously known.

In the end, however, much of the middle of the two panels was indecipherable. The weathering had taken most of the detail of the letters away leaving only a hint that a letter was once there.

Depth colouring

Laser scan, flattened and shaded

Deciphered sections from the inscription

 

Case study combining SfM + standard photogrammetry for virtual touring: Parlormade

The project for Parlormade Scone House was an exciting opportunity to combine our work in photogrammetry with drone SfM (Structure-from-Motion). It also brought the added challenge of capturing narrow indoor spaces with standard photogrammetry.

We broke the project into pieces, deciding to combine the exterior and interior models in post. For the exterior, we took ground photography to capture as much detail as possible, knowing that the tightly packed buildings of the Shambles meant we could not reach the angles needed for the roof without a UAV (unmanned aerial vehicle). We therefore used the drone to capture the upper angles of our building and its neighbours. Because of the extreme difference between the ground angles and the drone angles, the software could not automatically combine these pictures into one model, so they had to be merged manually in 3ds Max. After much cleaning and post-processing, the exterior was complete, and we shifted our focus to creating the interior.
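At heart, that manual merge is a rigid alignment: pick a handful of matching points on the two models (window corners, gutter ends and the like) and solve for the rotation and translation mapping one set onto the other. The Kabsch algorithm does exactly that; here is a sketch with made-up coordinates, not the Parlormade data:

```python
import numpy as np

def kabsch_align(src, dst):
    """Find the rotation R and translation t that best map src onto dst
    (least-squares rigid alignment over given corresponding points).

    src, dst : (n, 3) arrays of matched points picked on the two models."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Toy check: dst is src rotated 90 degrees about z and shifted.
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1.0]])
Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1.0]])
dst = src @ Rz.T + np.array([5.0, 2.0, 0.0])
R, t = kabsch_align(src, dst)
aligned = src @ R.T + t
```

In practice the picked points carry measurement noise, so the recovered transform is a best fit rather than exact, which is why some cleanup after the merge is still needed.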

Creating photogrammetry models of interior spaces is more complicated than it may seem. Typically, in photogrammetry, you move in a circular path around your subject, capturing it from all angles. When your subject surrounds you, however, as in the case of a room, you have to rethink how you capture every surface. Our brilliant intern Ben, who is also a skilled photographer, took a large portion of the interior photography. He captured each wall as if it were its own ‘object’, but made sure to tie it to the adjacent walls with plenty of overlapping photographs.

Blue squares represent the camera positions within the model.

Piecing things together in post-production

However, some features never come out well, whether captured inside or out: reflective surfaces such as mirrors and windows. Current software simply is not smart enough to recognise what a window is and how it should behave when modelled. Sunlight reflecting off shiny surfaces such as polished wood can also cause distortion, which is why many of the second-floor chairs and tabletops needed substantial reworking.

An amazing amount of information can be captured with the right photography. Looking at the second floor, you can see how well the high ceiling came out despite us not using a ladder to get closer, straight-on shots.

Wireframe of the completed model

The other complication with interior photogrammetry in a space like Parlormade is all the furniture that needs to be included and needs sufficient photographic coverage as well. Much of the building's internal features, such as the counter on the ground floor and the fireplaces, were captured easily enough. Features with spindly details like the chairs’ legs and the railing balusters did less well and had to be fixed in post. In fact, most of the chairs in the final model are a bit of a Frankenstein’s monster created by merging what was captured from different chairs into one jumble that resembles something somewhat realistic. 

Though an attempt was made to cover each space thoroughly enough to be able to process all three floors into one model in the software, the challenge of narrow and dark staircases made this impossible. In the end, a lot of post processing work went into seamlessly merging different models into the single interior model for the final design.

Once the model was complete, it was uploaded to Sketchfab where customers/potential customers of Parlormade could view the building and move around it in the 3D viewer. We were asked to add some historical information to add context to the medieval structure. We completed a few days of research and came up with 10 ‘stops’ around the model to help viewers better understand the history of the building, the Shambles and the tea and scone industry in the UK. The final result acts like a virtual tour which users can view while enjoying a scone in the cafe or while sitting at home deciding where they’d like to visit. Through the Sketchfab platform, users can also utilise the VR viewing option for a fully immersive experience.

This project combined a lot of photography and planning with quite a bit of post processing in order to pull together an engaging and immersive model. The end result, with photo-realistic quality, gives users the ability to move around the space virtually and view each room from any angle.

Photogrammetry for 3D printing

We recently worked on this Buddha statue project, creating a digital 3D model through the process of photogrammetry and then preparing it for 3D printing by the clients. Check out the different phases in this short video.

From the photographs, a point cloud is first created in Metashape, then the mesh is built, and finally the texture is added. We then open the model in standard 3D-printing software to see how it will behave during printing.
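As a toy illustration of those three stages (sampled points, then a mesh over them, then colour on the surface), here is a NumPy/SciPy sketch with random data standing in for the actual Metashape reconstruction:

```python
import numpy as np
from scipy.spatial import Delaunay

# Stage 1: a 'point cloud' -- random surface samples with a gentle relief.
rng = np.random.default_rng(0)
xy = rng.uniform(0.0, 1.0, size=(50, 2))
z = 0.1 * np.sin(3.0 * xy[:, 0])
cloud = np.column_stack([xy, z])

# Stage 2: 'mesh building' -- triangulate the samples (a 2.5D Delaunay here;
# Metashape reconstructs a full 3D surface, this only shows the idea).
faces = Delaunay(xy).simplices            # (m, 3) vertex indices per triangle

# Stage 3: 'texture' -- a per-vertex colour derived from height, standing in
# for projecting the original photographs back onto the mesh.
t = (z - z.min()) / (np.ptp(z) + 1e-12)
colours = np.column_stack([t, t, 1.0 - t])
```

The printed output depends only on the mesh stage, which is why the print-prep software cares about the geometry being watertight rather than about the texture.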

Landscape photogrammetry and AR

This reconstruction model of Skipsea Castle shows how we can combine drone photogrammetry with classic 3D modeling informed by research to get a more accurately placed model. We can then remove the landscape and use the model in an AR overlay, which should appear precisely on top of the existing Skipsea mound where the castle once stood. The AR shown here was tested at York Minster, as we've not yet made it out to the mound itself.
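For the GPS side of that anchoring, the mound's latitude and longitude have to become metre offsets in the user's local frame before the model can be placed. Over the few hundred metres an overlay spans, a flat-earth (equirectangular) approximation is enough; the coordinates below are hypothetical, not the mound's real position:

```python
import math

def latlon_to_local_metres(anchor_lat, anchor_lon, lat, lon):
    """Convert a lat/lon into (east, north) metre offsets from an anchor
    point, using a flat-earth (equirectangular) approximation -- fine
    over the few hundred metres an AR overlay spans."""
    R = 6371000.0  # mean Earth radius in metres
    d_lat = math.radians(lat - anchor_lat)
    d_lon = math.radians(lon - anchor_lon)
    north = d_lat * R
    east = d_lon * R * math.cos(math.radians(anchor_lat))
    return east, north

# Hypothetical coordinates: a user standing roughly 100 m south of the anchor.
east, north = latlon_to_local_metres(53.979, -0.215, 53.9799, -0.215)
```

The resulting (east, north) pair is what the AR scene uses to position the castle model relative to the phone, with consumer GPS accuracy (several metres) setting the limit on how precisely the overlay lands.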

Panelist & Demo - Intro to Virtual Production Event

On 25 January 2022, Bethany will take part in the Introduction to Virtual Production event at XPLOR / Production Park in Pontefract. The event features two panels, Unlocking Creativity and Production Management & Sustainability. She will join the Unlocking Creativity panel and will also demonstrate the photogrammetry process at a demo station accompanying the event.

This event, put on by Screen Industries Growth Network (SIGN), sister project XR Stories, and industry partner, XPLOR, “will demystify virtual production and delve into the creative and business opportunities it brings”.

The event will:

  • Explore how virtual production can unlock creativity

  • Examine the software and hardware tools used in virtual production

  • Consider how traditional workflows can be complemented with these new techniques

  • Investigate how remote production can facilitate collaboration

  • Explore the impact of these technologies on audiences of the future

  • Understand how virtual production can contribute to environmental and financial sustainability


Get your FREE tickets to this event now!

Date and time
Tue, 25 January 2022
09:00 – 15:00 GMT

Location
XPLOR
Production Park
Langthwaite Grange Ind Estate
South Kirby
WF9 3NR

Moving forward in digital heritage


The following is a guest article written by Owen Burton for and originally posted in Visitor Focus on the Association of Independent Museums website.

Working in digital heritage is exciting: as technology improves all the time, the possibilities of what can be achieved continue to grow. Here, AIM Associate Supplier Experience Heritage explores some of the best methods for heritage sites to produce engaging displays while managing the impact of social distancing.

Possibilities with Photogrammetry and Augmented Reality
Photogrammetry (creating a 3D model by stringing together a group of photographs) makes it possible for heritage sites to have pieces of their collection accessible online. People can interact with these objects in a new way and from different angles, all from the comfort of their own living room. In these challenging times, it might be possible for heritage site staff to be taught how to take the required photos of a given object or even how to use the modelling software for themselves.

Augmented Reality mobile apps can place historic reconstructions of sites over the current landscape to enable the public to visualise what used to be there, providing opportunities for storytelling to help bring inaccessible sites to life.

Mobile apps and virtual tours
There is an increasing awareness of the possibilities of heritage trail and self-guided tour mobile apps, another opportunity for interactive engagement with history while maintaining social distancing. Virtual tours have allowed digital access to sites that have been closed during lockdown, as people have been virtually wandering around such sites as the British Museum, the Louvre and the Van Gogh Museum.

Pause for thought . . . and communication
Lockdown has provided us with opportunities to learn and space to reflect. Heritage roundtables and webinars have lent greater clarity to the day-to-day realities of what sites have been going through and where their priorities lie. These priorities have included expanding audience engagement through digital opportunities and ensuring digital communication is accessible to people with disabilities.

If you would like to explore possibilities for digital engagement, like photogrammetry, augmented reality or heritage trail apps, Experience Heritage would love to help. Visit our website at www.experience-heritage.com or email us at info@experience-heritage.com.

Pictured: Mockup imagining of an AR app for Slingsby Castle by Experience Heritage.

Meet the director of Experience Heritage, Bethany Watrous


Director Bethany Watrous talks about her background in digital archaeology and heritage, and gives a behind-the-scenes insight into what the company does.