
Automated Field of View System

Last Updated: 15th May 2020

By Travis Olds



In this article I describe the construction of an automated field of view measurement tool for micromineral photographers using a bellows and plan apochromatic objectives. I first need to acknowledge Stephan Wolfsried for his suggestions on how to assemble a high-resolution microphotography system; the setup I use is based primarily on his suggestions and similar components outlined in several of his helpful articles [1]. However, the numerous adapters required to assemble the optical column make the field of view difficult to calculate, and I had to measure it with a micrometer slide after each new bellows setting to get an accurate result. This became tedious and time consuming, so I wanted to design something that could measure the field of view (FOV), in real time, at any bellows extension.

Fig. 1 The automated FOV system.
Fig. 2 Closeup of LCD and Uno.




The automated FOV system relies on a time-of-flight IR laser distance sensor to measure the bellows extension length. After an external calibration with a micrometer slide, the measured bellows distance is converted and averaged in code on an Arduino Uno microcontroller, and a quite precise (+/- ~0.02 mm) FOV is shown on an LCD screen, updated every second with a rolling average. I have housed the LCD screen and Uno inside a 3D resin-printed case with a ¼”-20 screw mount; this is not a necessary component, but I can provide details for its build if you have access to a 3D printer. Considering the cost of most photography equipment, the cost of this adaptation is relatively low (~$100 USD), and it can be assembled and up and running in roughly 2 hours using these instructions. The Arduino sketch, an example Excel calibration spreadsheet, and .stl files for the optional case are available on GitHub:

https://github.com/toldxls/Automated-FOV-system
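The gist of the sketch is simple: read the sensor distance, keep a rolling average, apply a linear calibration, and print the result. For illustration only, a minimal sketch along those lines - not the published file - might look like the following. It assumes the Adafruit libraries listed in Step 4 and uses the calibration constant names from the update note at the end of this article (CTRLX, CTRLY), with the example 20x values that you would replace with your own from Step 5.

// Minimal illustration (not the published sketch): read the VL6180X range,
// keep a rolling average, apply a linear calibration, and show the FOV on the LCD.
#include <Wire.h>
#include <Adafruit_VL6180X.h>
#include <Adafruit_RGBLCDShield.h>

Adafruit_VL6180X vl = Adafruit_VL6180X();
Adafruit_RGBLCDShield lcd = Adafruit_RGBLCDShield();

const int N = 20;              // number of readings in the rolling average
float readings[N];
int idx = 0;

// Example calibration constants from the article's 20x calibration - replace with your own
float CTRLX = -0.005;          // slope (FOV change per mm of measured distance)
float CTRLY = 1.5688;          // Y intercept

void setup() {
  Serial.begin(115200);
  lcd.begin(16, 2);
  vl.begin();
  for (int i = 0; i < N; i++) readings[i] = 0.0;
}

void loop() {
  uint8_t range = vl.readRange();          // distance in mm
  if (vl.readRangeStatus() == VL6180X_ERROR_NONE) {
    readings[idx] = range;
    idx = (idx + 1) % N;
  }

  float average = 0.0;
  for (int i = 0; i < N; i++) average += readings[i];
  average /= N;

  float fov = average * CTRLX + CTRLY;     // linear calibration from Step 5

  lcd.setCursor(0, 0);
  lcd.print("DIST ");
  lcd.print(average, 2);
  lcd.print(" mm   ");
  lcd.setCursor(0, 1);
  lcd.print("AVG FOV ");
  lcd.print(fov, 2);                       // 2 decimal places, as in the update note
  lcd.print(" mm ");

  Serial.println(range);                   // raw distance, used in the calibration step
  delay(500);                              // two readings per second
}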

*In this build the distance sensor works well on the flat surfaces of a Balpro1 bellows mount; however, I have not tested other bellows. If you use a different bellows, it may require some modification, such as side-mounting the distance sensor. See Step 3 below for more details on how that might be done. Just by eye, the Nikon PB6 and some of the Fotodiox bellows may well work.


Parts list


I’ve listed some of the cheapest options below; many can be easily substituted.
(Prices are as of ~May 2020.)

Soldering kit ($20.95) https://www.amazon.com/Soldering-Iron-Kit-Temperature-Desoldering/dp/B07S61WT16
Arduino Uno R3 ($12.99) https://www.amazon.com/ELEGOO-Board-ATmega328P-ATMEGA16U2-Compliant/dp/B01EWOE0UU
LCD 16x2 shield ($19.95) https://www.adafruit.com/product/772
Breadboard Wires (or any wires really) (~$6) https://www.amazon.com/EDGELEC-Breadboard-Optional-Assorted-Multicolored/dp/B07GD2BWPY
VL6180X time-of-flight sensor ($13.95) https://www.adafruit.com/product/3316
Scotch double sided tape ($7.99) https://www.amazon.com/Scotch-Removable-Dispenser-Standard-667/dp/B00006IF63
A reflective material, white tape/masking tape/label/paper/aluminum foil (~$5) https://www.amazon.com/Avery-All-Purpose-Labels-Inches-White/dp/B000BQOCRK
Micrometer slide (~$10) https://www.amazon.com/0-01mm-Micrometer-Microscope-Camera-Calibration/dp/B078QJHKFX
USB-A extension cable (2 m) ($3.99) https://www.amazon.com/dp/B00007FGU5


Step 1: Uno + LCD shield construction


Adafruit provides an easy-to-follow assembly guide for their LCD shield; see the link below. It involves ~30 minutes of soldering several resistors, a potentiometer, an I2C chip, pin headers and pushbuttons. I did the soldering under a microscope. Note that other LCD "shields" have slightly different pinouts, and the LCD itself can sit in a slightly different position with respect to the underlying Uno board. Bear in mind that the 3D printed case I designed only fits the Adafruit shield; however, printable cases for other LCD shields (e.g. the Sainsmart shield) are available on Thingiverse.com. Below is a photo of the fully assembled shield, including the connections to the VL6180X sensor that will be made in the next step.

LCD shield assembly: https://learn.adafruit.com/rgb-lcd-shield/assembly

Step 2: Powering up and connecting the VL6180X to the Uno


The Uno can be powered via a USB-B cable, the sort used with printers, and I additionally use a 2 meter extension cable to connect it to my PC. NOTE: when first powered on, the LCD backlight will light up but nothing will display. The orange contrast potentiometer on the shield can be adjusted until characters appear. With no code uploaded to the Uno, adjusting the potentiometer until boxes appear is close enough; in Step 4 you will adjust it again to set the best contrast for characters on the display.

Fig 3. The fully assembled LCD shield.

Assembling the VL6180X: https://learn.adafruit.com/adafruit-vl6180x-time-of-flight-micro-lidar-distance-sensor-breakout/assembly

The connections needed between the Uno and sensor are straightforward and are described in the Adafruit tutorial linked above; I will summarize them briefly here. Strip the insulation from both ends of 4 multicolor breadboard wires. Solder one wire to the 5V pin on the shield and the other end of the same wire to the 5V pin on the VL6180X. Do the same for the ground, SCL and SDA pins, being sure to use the 2nd ground terminal on the shield (the rightmost GND pin).

Fig. 4 Wiring details for shield.
Fig. 5 Wiring details for VL6180X.


Step 3: Mounting the VL6180X sensor and reflective surface


I mount the sensor onto the base of the frontmost inner mount of the Balpro1 using a small piece of Scotch double-sided tape, with the laser-emitting module and wires towards the bottom. On the Balpro1 there is a convenient gap to guide the sensor wires between the bellows mount and through the rail tracks, out to the Uno and LCD. The infrared laser of the distance sensor is emitted as a relatively narrow cone of light, and I have found that my sensor has a usable range of ~10 to ~210 mm under low ambient light levels. To preserve the laser's output, it is important not to keep the module powered for long periods; after several months of use and several calibrations I have found that the minimum usable range has increased slightly to ~12 mm, with high nonlinearity for measurements below 12 mm. Be sure to remove the plastic film covering the black rectangular laser and detection module of the sensor before continuing - it must have an unobstructed path between the source and the opposing measurement surface, which in my case is the opposing inner edge of the bellows mount. Any obstacles, wires or other materials in the beam path will lead to nonlinear or inaccurate distance measurements. On the opposite edge of the bellows mount, where the laser pulses are reflected, I have placed a white sticker label to increase the reflective surface area; a piece of aluminum foil or reflective tape may work even better.

Fig 6. The mounted VL6180X and reflective surface.


If you do not use the Balpro1 bellows and there is not sufficient clearance for the laser (if the baffles of the bellows interfere), then you may have to attach an extension or protrusion to each end of the bellows mounts: one for the laser sensor and one for the opposing measurement face. This might be done easily with two popsicle sticks and tape, but of course a stable, permanent mount (drilled and tapped holes) for the sensor and reflective face is ideal.

Step 4: Load the Arduino Sketch code to the Uno


I have written an Arduino sketch that captures distance data from the sensor and displays it on the LCD screen using a rolling-average FOV calculation. You can download the code from the GitHub link below, or, if you are familiar with sketch coding, you can write something that better suits your needs or your preferred microcontroller.

Download the Arduino IDE for your system from: https://www.arduino.cc/en/Main/Software

Download my sketch file (FOVcalibratedLCDaverage_m6mark2_20x.ino) from: https://github.com/toldxls/Automated-FOV-system

Once installed, the first step is to ensure that the Arduino IDE is connected to your Arduino. An easy way to test this is to click the "Tools" tab, then hover over "Port" to see if a COM channel is listed. Another method is to try uploading one of the example sketches to your Uno. The classic test for an LCD is the "Hello World" sketch: click "File," then "Examples," navigate down to "Adafruit RGB LCD Shield Library," and click "HelloWorld." To upload the sketch, press Ctrl+U, or use the "Sketch" tab and click "Upload." If the board is not connected, try restarting your computer and repeat. You now need to adjust the orange contrast potentiometer on the shield so that the LCD displays text; I needed quite a few counterclockwise turns before the contrast changed. If the shield was assembled and seated properly, the LCD will display the Hello World message.
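If you would rather type a quick test than hunt through the menus, a stripped-down equivalent is shown below (for illustration only; it assumes the Adafruit RGB LCD Shield library is already installed - see the library list in the next section if it is not):

// Bare-bones shield test: initialize the 16x2 RGB LCD shield and print a message.
#include <Wire.h>
#include <Adafruit_RGBLCDShield.h>

Adafruit_RGBLCDShield lcd = Adafruit_RGBLCDShield();

void setup() {
  lcd.begin(16, 2);            // 16 columns, 2 rows
  lcd.print("Hello, world!");
}

void loop() {
  // nothing to do - if you can read the text, the shield works
}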

Fig. 7 Uno powered on and no sketch uploaded.
Fig. 8 Hello, World!


Once you've ensured the Uno is connected, open the "FOVcalibratedLCDaverage..." sketch. The sketch calls on several additional libraries that must be installed. In the IDE, click the "Tools" tab, then "Manage Libraries...," and use the search bar to find and install the following libraries:

LiquidCrystal
Adafruit_VL6180X
Adafruit RGB LCD Shield Library
Adafruit MCP23017 Arduino Library

Once the library installations have finished, perform a test compile of the sketch by pressing Ctrl+R, or use the "Sketch" tab and click "Verify/Compile," then upload the FOV sketch to the Uno. The LCD should now display the distance and FOV in mm as in the image below; however, until you complete the external calibration, the reported FOV will be inaccurate.

Fig 9. Adjusting the contrast trimpot.


Step 5: FOV Calibration


In this step you will calculate the characteristic conversion factors used in the Arduino sketch to convert the measured bellows distance to an average field of view. You will prepare a graph using multiple bellows distances measured by the laser sensor and the corresponding fields of view measured with a microscope micrometer. The procedure takes about 30-40 minutes. I use 12 distance measurements with the Balpro1 at ~10 mm intervals (12, 22, 32 … 122 mm). Your conversion factors will depend on the distance between the objective and the camera you use, but they also scale with the magnification of the objective if your objectives are mutually parfocal. In other words, if you perform the calibration with a 10x objective but use a different-magnification objective of the same focal length for other photos, simply scale the reported FOV by the ratio of the magnifications: twice the reported value for a 5x objective, half for a 20x, and so on. The calibration procedure is as follows:

With the sensor mounted, the sketch uploaded, and the Uno connected and running, open the serial monitor (Ctrl+Shift+M on Windows, or click "Tools," then "Serial Monitor") to view the distance in millimeters reported by the sensor. The sketch reports 2 measurements every second. Although the readout speed can be changed, keep in mind that LCD screens have a limited refresh rate; if set to refresh very quickly (<0.5 seconds), the LCD text output will become jumbled.
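If you do want faster sensor sampling without jumbling the LCD, one common Arduino pattern - shown here only as an illustration, not necessarily how the published sketch handles it - is to read the sensor as often as you like but redraw the display on a slower, millis()-based schedule:

// Sample the sensor quickly, but only redraw the LCD every 500 ms.
#include <Wire.h>
#include <Adafruit_VL6180X.h>
#include <Adafruit_RGBLCDShield.h>

Adafruit_VL6180X vl = Adafruit_VL6180X();
Adafruit_RGBLCDShield lcd = Adafruit_RGBLCDShield();
unsigned long lastDraw = 0;
float lastRange = 0;

void setup() {
  lcd.begin(16, 2);
  vl.begin();
}

void loop() {
  lastRange = vl.readRange();           // take readings as fast as you like
  if (millis() - lastDraw >= 500) {     // but update the LCD only twice per second
    lastDraw = millis();
    lcd.setCursor(0, 0);
    lcd.print("DIST ");
    lcd.print(lastRange, 1);
    lcd.print(" mm   ");
  }
}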

Fig 10. Grab the data from the serial monitor.


Set the bellows length so that the sensor can take a measurement near the minimum bellows extension. My sensor does not read well below ~12 mm, so I set it to where the sensor reads ~12 +/- 2 mm. Next, orient and focus the micrometer slide in the field of view of your camera so that the tick marks are square and in focus over the majority of the live view image (if you use tethered shooting) or the captured image. Snap a photo of the micrometer at this distance setting. You will next need to measure the number of pixels between corresponding tick marks in order to calculate the field of view of the image just taken. Open the photograph of the micrometer in a program that can read out pixel counts; I use "Preview" on macOS, but ImageJ on Windows can also be used.

Fig 11. Finding Pcal.


To calculate the field of view of the image, determine the number of pixels between the markers on your micrometer. I have a micrometer with 0.2 mm demarcations and use the number of pixels found between every 2 marks (0.4 mm). The FOV of your image is the number of horizontal pixels in a full image (Pfull) multiplied by the ratio of the distance spanned by the demarcations (I use 2 marks, so 0.4 mm) to the number of pixels counted between them (Pcal):

FOV = Pfull x (0.4/Pcal)

For example, I use a Canon M6 Mark II that captures images with 6960 horizontal pixels. At a bellows distance of ~12 mm on the distance sensor and using a 20x objective, the number of pixels between 2 micrometer demarcations is ~1830 pixels. So the FOV of the image at this bellows distance is:

FOV = 6960 x (0.4/1830) = 1.52 mm

Next you will need to calculate an average distance readout from the sensor using the serial monitor. If it is not already open, press Ctrl+Shift+M, uncheck the autoscroll option in the serial monitor window, then copy and paste enough values for a good statistical average into Excel/Origin/whichever program you prefer. I use ~1 minute's worth of distance measurements and calculate the average with Excel. My average readout for the first bellows setting was 14.47 mm.

The first calibration point on my graph is then X (measured distance) = 14.47 mm, Y (FOV) = 1.52 mm. Repeat this procedure by moving the bellows another ~10 mm, snapping a photo once the micrometer is focused, and then calculating the average sensor distance readout from the serial monitor. Exact 10 mm steps are not necessary, but being close will allow you to check that the distance measured by the sensor is linear throughout the bellows extension range.

From the series of X,Y points, calculate a linear regression to determine the slope and Y intercept of what should be a linear calibration. My values with a 20x objective are -0.005 and 1.5688, with R² = 0.994. An R² close to 1 is ideal.
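If you prefer not to use Excel, the same fit can be done with a short stand-alone program compiled on your PC. The following is only a sketch of the idea; the calibration pair shown is the single example point from above, and you would paste in your own measurements:

// Ordinary least-squares fit of FOV (mm) versus measured bellows distance (mm).
// Compile with e.g. g++ fit.cpp -o fit, then run it after pasting in your points.
#include <cstdio>
#include <utility>
#include <vector>

int main() {
    // (sensor distance, measured FOV) calibration pairs - replace with your own.
    std::vector<std::pair<double, double>> pts = {
        {14.47, 1.52},   // the first calibration point from the example above
        // ... add the rest of your measurements here ...
    };

    if (pts.size() < 2) {
        std::puts("Add at least two calibration points before fitting.");
        return 1;
    }

    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    const double n = pts.size();
    for (const auto& p : pts) {
        sx  += p.first;
        sy  += p.second;
        sxx += p.first * p.first;
        sxy += p.first * p.second;
    }
    const double slope     = (n * sxy - sx * sy) / (n * sxx - sx * sx);
    const double intercept = (sy - slope * sx) / n;
    std::printf("slope = %.6f   Y intercept = %.4f\n", slope, intercept);
    return 0;
}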

Fig 12. The linear regression.


Last step: Editing the Sketch Code for your calibration


Replace the slope and Y intercept values in the two highlighted lines of the sketch (shown below) with your own.
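In the updated version of the sketch (see the update note at the end of this article), these are simply two floats near the top of the file; with my 20x calibration they read:

float CTRLX = -0.005;    // slope from the linear regression
float CTRLY = 1.5688;    // Y intercept from the linear regression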

Fig 13. Editing the sketch.


Compile the sketch and upload it to the Uno. Once uploaded, the FOV reported on the LCD will take ~10 seconds to reach a stable moving-average value. FOV measurements are most precise at very low ambient light levels (room lights off). The system is now ready; it should be re-calibrated every few months, or whenever the sensor or reflective surface is moved. I found that my sensor and reflective material took several days to fully settle in place because neither was permanently mounted.

If you use multiple cameras, or a variety of objectives/lenses from different manufacturers that are not mutually parfocal, then each will need a separate calibration. In this case, you could prepare a calibrated sketch for each scenario and load the proper file to the Uno when needed.

I hope that this article is helpful to anyone who might enjoy automated FOV readouts. I am open to suggestions on how to make this article and system better, more accurate, or accessible for more micromineral photographers.

Dr. Travis Olds
Assistant Curator of Minerals
Carnegie Museum of Natural History

*UPDATES: Editing the sketch file is now much simpler. At the top of the sketch you will find just two values to edit with your calibration values (float CTRLX = -0.005; float CTRLY = 1.5688;). The calculated FOV values are now constrained to 2 decimal places in the output, which seems to have fixed the text overlap problem at fast refresh rates. Thank you, Jolyon, for those two fixes. I have also scaled down the size of the rolling-average array and increased the refresh rate, so the response time to bellows distance changes is MUCH faster (~5 seconds)!

Footnotes

1. Stephan Wolfsried's article






Discuss this Article

14th May 2020 20:29 UTC - Joy Desor (Expert)

Very nice article! Thanks for sharing!

14th May 2020 22:28 UTC - Travis Olds (Expert)

Thanks, Joy! I figured you are at least one of the handful of people who would find it interesting.

15th May 2020 03:55 UTC - John A. Jaszczak (Expert)

Thank you Travis. That is excellent. Thanks for sharing!
John

15th May 2020 09:17 UTC - Jolyon Ralph (Founder)

Excellent work!

15th May 2020 09:25 UTC - Keith Compton (Manager)

That's an impressive bit of kit.

I must say you lost me at "plan apochromatic objective". But a great informative read. I might have to get you to create one for me !!

Well done.

15th May 2020 09:27 UTC - Jolyon Ralph (Founder)

I noticed that you're not constraining the output at the end to a fixed number of decimal places. You may want to try something like this instead.

char lcdtext[32];
sprintf(lcdtext,"AVG FOV %.4f mm",(average * -0.005)+1.5688);
lcd.print(lcdtext);

This should restrict the output to 4 decimal places (adjust the %.4f accordingly if you want fewer or more)

15th May 2020 13:39 UTC - Travis Olds (Expert)

Thanks John, Keith, and Jolyon!

Jolyon, I've added your suggestion to the sketch and will update it on github, thanks for that. I'll use 0.XXX. In the past I had the serial monitor also output calculated FOV for checking, and the excess significant values reported annoyed me but I couldn't figure it out. My cheap way around it was that lcd.print only sent 3 characters of the avg to the LCD, anyway, and I removed the serial FOV printout since it was extra tidying work for the calibration. If I don't see them I don't have to worry about it, right?!

If I were to update this I'd switch to one of the larger OLED screens that are easier to read.

16th May 2020 09:22 UTC - Volker Betz (Expert)

(image attached)
Hello Travis,
very interesting setup and inspiring. My concern is that this calculated FOV is not the same as what is visible in the stacked picture. I recently did some comparison of lens focus stacking and rail stacking, and I could see that the stacking procedure with Helicon Focus crops the pictures.
So at a theoretical 1:1 scale, the final picture has a reproduction scale of 1.28:1. This is an extreme value for rail stacking; with lens focus stacking it is 1.14:1. I stacked 100 pictures in 400 µm steps for a total DOF of 40 mm at a FOV of ~20 mm. How much the picture is cropped is hidden in the algorithm of the stacking software.

Volker

16th May 2020 16:42 UTC - Travis Olds (Expert)

Volker, you are right. It is a problem I haven't figured out how to account for in the final stacked images. The cropping in Helicon Focus is partly introduced by not stacking exactly on the optical axis of the specimen; when the rail movement deviates from the normal plane of the image/view, the program seems to crop the edges that aren't present throughout the stack.


Helicon Focus also applies some scale/tilt correction to each image that causes cropping - mostly a function of how stable your setup is throughout the stack. I am also not sure where to find the cropping values, and I have not tested this, but does it crop different edges depending on the order in which the images are processed? I wonder whether reversing the order would crop the opposite edges. Also, mirrorless cameras (at least mine) apply some distortion corrections to individual images for rolling shutter problems. The correction can vary at different heights of the image, depending on the level of vibration and shutter speed.

Maybe I am wrong, but in your images it looks like the micrometer plane is tilted several degrees from perpendicular, so the areas furthest from perpendicular (near the top of the image) have an apparently wider FOV. I think this is an unavoidable error in most photographs, but we do not usually measure field of view at the very top or bottom of an image anyway. Could you repeat the test with the demarcations close to perpendicular? In my setup this distortion is also commonly present because I use a tripod and the rail movement proceeds at a slight angle, and my error in sample placement always yields some cropping and stretching in HF because the stack doesn't move on the optic axis. However, the calibration images are not processed in Helicon Focus, and I try to count pixels at the center of the image, where your eye would naturally choose an area of the vug/crystal group that lies perpendicular (or at least close) to the optic axis. So for stacks that are not highly distorted or shifted in Helicon Focus, the center of the processed image is probably still quite close to the apparent scale of the calibrated image. In any case, the measurement is certainly less accurate than +/- 0.02 mm.

16th May 2020 18:20 UTC - Volker Betz (Expert)

Hello Travis,
The way I measure sizes is usually to set the object in focus on the screen at two characteristic points and then move it manually (usually horizontally) with a micrometer, using the grid in the camera tethering software. I note the distance and remeasure it on the stacked picture. From that I calculate the FOV of the stacked picture. Usually I only note 2-digit numbers.

My setup is aligned as well as I can manage. I also use different setups (macro lenses, extreme macro lenses, bellows, tubes and telephoto lenses) on the same rail, so I have to accept some misalignment as long as I get a picture which is OK. With rail stacking, only in a few cases is the DOF twice the FOV, as in my example. Another point for me: we make pictures, not size and quality control of crystals, so it does not matter (to me) whether the size of a crystal is 0.7 or 0.75 mm. I am only sceptical when a three-digit number is given; then I have to ask (I was an organic chemistry environmental trace analyst in the ppb range) for the statistics behind that value.

So, all in all, your method is fine, keeping in mind that the stacking software alters the value; only the third digit is questionable. For those who want it exact: Keyence sells measuring microscopes.

16th May 2020 18:30 UTC - Travis Olds (Expert)

I think the difference in scale for each scenario you tested is related to being off axis. With the rail moving, it will sample an overall larger deviation from the optic axis, whereas with focus stacking the deviation angle does not change.

16th May 2020 20:40 UTC - Paul Brandes (Manager)

Impressive, and very useful.
Great piece of work, Travis!

19th May 2020 15:11 UTC - Travis Olds (Expert)

Thanks, Paul!
 