Photogrammetry with Beans
Tags: art-and-design, hypertext-art-installations.
Or, how to turn a bunch of photos into a publicly visible 3D model.
I’ve been participating in a weekly call with a group of old friends¹ since April 2020. This was at the start of COVID, and we didn’t have much exciting personal news to report from week to week, so we started assigning ourselves weekly challenges that would provide a topic of conversation.
Those challenges started out as concrete tasks, like paint along with Bob Ross or write a haiku. But they’ve gradually slid into more absurdly impressionistic projects like balance or the intriguingly ambiguous guess the oyster.
This week’s project was to photograph beans.
If you take a bunch of photos of an object from different angles, it’s possible to use techniques from photogrammetry to stitch them back together into a 3D model of the object. My partner Jenny did this a few years ago with video game screenshots to reconstruct character models for 3D printing, but we somehow never tried it with real-world objects!
I ran across the idea of photogrammetry again this week and thought it’d be a fun way to satisfy the weekly challenge. After a few false starts with various tools and methods, I found a pipeline that can get you from “a few dozen photos” to “3D model embedded on a Web site” within an hour or so. Once I figured out the right tools, the whole process was surprisingly straightforward!
First, install Metashape (the 30-day trial is sufficient) and create a free account on SketchFab. Metashape generates the model and texture map from the images, and SketchFab hosts it and provides a widget you can embed.
Once you’ve got your software and accounts set up:
- Take a bunch of photos from different angles. Try to get into all the nooks and crannies. I took between 30 and 60 photos for each of the objects on this page.
- Import the photos into a new Metashape project and follow the instructions in the tutorial.
- Enter your SketchFab API key into Metashape and upload directly from the application. I found this much easier than faffing about with texture-map exports, but of course you can also export your model in numerous formats.
- Embed the SketchFab widget into the HTML of your page.
This process fails hilariously in certain situations. Photogrammetric software works, more or less, by identifying interesting points in each photo, finding the same points in other photos, and using those correspondences to figure out where those points must be in 3D space. Identifying those interesting points correctly seems to work best when:
- Different sides of an object are visually distinct, with minimal symmetry,
- surfaces are patterned, or at least textured, and not reflective, and
- all objects in the scene are entirely opaque.
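The reconstruction step behind all this can be sketched in miniature. Once the software has matched the same point across two photos taken from known camera positions, recovering that point’s 3D location is a small linear-algebra problem called triangulation. Here’s a toy illustration with made-up camera matrices (the cameras, the point, and all the numbers below are invented for the example; this is the textbook linear method, not what Metashape actually runs internally):

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover a 3D point from its 2D projections in two views (linear DLT).

    P1, P2 are 3x4 camera projection matrices; x1, x2 are the (u, v)
    image coordinates of the same physical point in each photo.
    """
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the null vector of A, found via SVD.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize

def project(P, X):
    """Project a 3D point into a camera's image plane."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two toy cameras: one at the origin, one shifted along the x axis.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.5, 0.2, 4.0])
x1, x2 = project(P1, X_true), project(P2, X_true)
print(triangulate(P1, P2, x1, x2))  # recovers [0.5, 0.2, 4.0] up to float error
```

With noise-free correspondences this recovers the point exactly; with real photos, the software solves thousands of these simultaneously while also estimating the camera positions themselves, which is why bad correspondences (symmetry, reflections, transparency) poison the whole model.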
My first attempt, a glass measuring cup full of beans, violated all those rules and left me with fittingly horrific results:
Next I tried a ceramic bowl full of beans, but the symmetry of the bowl and the bean-pile seemed to make different angles difficult to distinguish, and I ended up with only a “slice” of the object:
My coffee grinder (with coffee beans!) turned out relatively well, but the smooth black plastic wasn’t quite matte or patterned enough and the resulting model has a Dali-esque melting quality. The transparent plastic drawer in the front also, uh, went wrong:
The original project to photograph beans yielded mixed results. But I’ve since created some more successful models (like the boots above), and it was fun figuring out how to produce a model!
¹ Well, a bunch of such calls, actually. Kind of a silver lining from this whole ordeal!