Mirror with some correction

Dependencies:   mbed FastIO FastPWM USBDevice

Committer: arnoz
Date: Fri Oct 01 08:19:46 2021 +0000
Revision: 116:7a67265d7c19
Parent: 104:6e06e0f4b476
- Correct information regarding your last merge

// Edge position sensor - 2D optical
//
// This class implements our plunger sensor interface using edge
// detection on a 2D optical sensor. With this setup, a 2D optical
// sensor is placed close to the plunger, parallel to the rod, with a
// light source opposite the plunger. This makes the plunger cast a
// shadow on the sensor. We figure the plunger position by detecting
// where the shadow is, by finding the edge between the bright and
// dark regions in the image.
//
// This class is designed to work with any type of 2D optical sensor.
// We have subclasses for the TSL1410R and TSL1412S sensors, but other
// similar sensors could be supported as well by adding interfaces for
// the physical electronics. For the edge detection, we just need an
// array of pixel readings.

#ifndef _EDGESENSOR_H_
#define _EDGESENSOR_H_

#include "plunger.h"

// Scan method - select a method listed below. Method 2 (find the point
// with maximum brightness slope) seems to work the best so far.
#define SCAN_METHOD 2
//
//
// 0 = One-way scan. This is the original algorithm from the v1 software,
//     with some slight improvements. We start at the brighter end of the
//     sensor and scan until we find a pixel darker than a threshold level
//     (halfway between the respective brightness levels at the bright and
//     dark ends of the sensor). The original v1 algorithm simply stopped
//     there. This version is slightly improved: it scans for a few more
//     pixels to make sure that the majority of the adjacent pixels are
//     also in shadow, to help reject false edges from sensor noise or
//     optical shadows that make one pixel read darker than it should.
//
// 1 = Meet in the middle. We start two scans concurrently, one from
//     the dark end of the sensor and one from the bright end. For
//     the scan from the dark end, we stop when we reach a pixel that's
//     brighter than the average dark level by 2/3 of the gap between
//     the dark and bright levels. For the scan from the bright end,
//     we stop when we reach a pixel that's darker by 2/3 of the gap.
//     Each time we stop, we look to see if the other scan has reached
//     the same place. If so, the two scans converged on a common
//     point, which we take to be the edge between the dark and bright
//     sections. If the two scans haven't converged yet, we switch to
//     the other scan and continue it. We repeat this process until
//     the two converge. The benefit of this approach vs the older
//     one-way scan is that it's much more tolerant of noise, and the
//     degree of noise tolerance is dictated by how noisy the signal
//     actually is. The dynamic degree of tolerance is good because
//     higher noise tolerance tends to result in reduced resolution.
//
// 2 = Maximum dL/ds (highest first derivative of luminance change per
//     distance, or put another way, the steepest brightness slope).
//     This scans the whole image and looks for the position with the
//     highest dL/ds value. We average over a window of several pixels,
//     to smooth out pixel noise; this should avoid treating a single
//     spiky pixel as having a steep slope adjacent to it. The advantage
//     in this approach is that it looks for the strongest edge after
//     considering all edges across the whole image, which should make
//     it less likely to be fooled by isolated noise that creates a
//     single false edge. Methods 0 and 1 have basically fixed
//     thresholds for what constitutes an edge, but this approach is
//     more dynamic in that it evaluates each edge-like region and picks
//     the best one. The width of the edge is still fixed, since that's
//     determined by the pixel window. But that should be okay since we
//     only deal with one type of image. It should be possible to adjust
//     the light source and sensor position to always yield an image with
//     a narrow enough edge region.
//
//     The max dL/ds method is the most compute-intensive method, because
//     of the pixel window averaging. An assembly language implementation
//     seems to be needed to make it fast enough on the KL25Z. This method
//     has a fixed run time because it always does exactly one pass over
//     the whole pixel array.
//
// 3 = Total bright pixel count. This simply adds up the total number
//     of pixels above a threshold brightness, without worrying about
//     whether they're contiguous with other pixels on the same side
//     of the edge. Since we know there's always exactly one edge,
//     all of the dark pixels should in principle be on one side, and
//     all of the light pixels should be on the other side. There
//     might be some noise that creates isolated pixels that don't
//     match their neighbors, but these should average out. The virtue
//     of this approach (apart from its simplicity) is that it should
//     be immune to false edges - local spikes due to noise - that
//     might fool the algorithms that explicitly look for edges. In
//     practice, though, it seems to be even more sensitive to noise
//     than the other algorithms, probably because it treats every pixel
//     as independent and thus doesn't have any sort of inherent noise
//     reduction from considering relationships among pixels.
//

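The counting idea behind method 3 is simple enough to sketch in a few lines of portable C++. This is an editorial illustration, not firmware code: `countBrightPixels` is a hypothetical name, and the 5-pixel end averaging mirrors the `process()` implementations further down.

```cpp
#include <cstdint>

// Sketch of scan method 3: count pixels brighter than the midpoint
// between the average brightness levels at the two ends of the array.
// (Illustrative only; the firmware's version also tracks orientation.)
int countBrightPixels(const uint8_t *pix, int n)
{
    // average the first and last 5 pixels to estimate the two levels
    int a = 0, b = 0;
    for (int i = 0; i < 5; ++i) {
        a += pix[i];
        b += pix[n - 1 - i];
    }
    a /= 5; b /= 5;

    // midpoint brightness separating "bright" from "shadow"
    int mid = (a + b) / 2;

    // count pixels above the midpoint; the bright-region size is the
    // position reading for this method
    int bcnt = 0;
    for (int i = 0; i < n; ++i) {
        if (pix[i] > mid)
            ++bcnt;
    }
    return bcnt;
}
```

Isolated noisy pixels on either side of the edge tend to cancel in the count, which is the method's main appeal.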
// assembler routine to scan for an edge using "mode 2" (maximum slope)
extern "C" int edgeScanMode2(const uint8_t *pix, int npix, const uint8_t **edgePtr, int dir);

// PlungerSensor interface implementation for edge detection setups.
// This is a generic base class for image-based sensors where we detect
// the plunger position by finding the edge of the shadow it casts on
// the detector.
//
// Edge sensors use the image pixel span as the native position scale,
// since a position reading is the pixel offset of the shadow edge.
class PlungerSensorEdgePos: public PlungerSensorImage<int>
{
public:
    PlungerSensorEdgePos(PlungerSensorImageInterface &sensor, int npix)
        : PlungerSensorImage(sensor, npix, npix - 1)
    {
    }

    // Process an image - scan for the shadow edge to determine the plunger
    // position.
    //
    // If we detect the plunger position, we set 'pos' to the pixel location
    // of the edge and return true; otherwise we return false. The 'pos'
    // value returned, if any, is adjusted for sensor orientation so that
    // it reflects the logical plunger position (i.e., distance retracted,
    // where 0 is always the fully forward position and 'n' is fully
    // retracted).

#if SCAN_METHOD == 0
    // Scan method 0: one-way scan; original method used in v1 firmware.
    bool process(const uint8_t *pix, int n, int &pos, int& /*processResult*/)
    {
        // Get the levels at each end
        int a = (int(pix[0]) + pix[1] + pix[2] + pix[3] + pix[4])/5;
        int b = (int(pix[n-1]) + pix[n-2] + pix[n-3] + pix[n-4] + pix[n-5])/5;

        // Figure the sensor orientation based on the relative brightness
        // levels at the opposite ends of the image. We're going to scan
        // across the image from the bright side - 'bi' is the starting
        // index for the scan, and 'dir' is the pixel increment for the
        // index.
        int bi;
        if (a > b+10)
        {
            // left end is brighter - standard orientation
            dir = 1;
            bi = 4;
        }
        else if (b > a+10)
        {
            // right end is brighter - reverse orientation
            dir = -1;
            bi = n - 5;
        }
        else if (dir != 0)
        {
            // We don't have enough contrast to detect the orientation
            // from this image, so either the image is too overexposed
            // or underexposed to be useful, or the entire sensor is in
            // light or darkness. We'll assume the latter: the plunger
            // is blocking the whole window or isn't in the frame at
            // all. We'll also assume that the exposure level is
            // similar to that in recent frames where we *did* detect
            // the direction. This means that if the new exposure level
            // (which is about the same over the whole array) is less
            // than the recent midpoint, we must be entirely blocked
            // by the plunger, so it's all the way forward; if the
            // brightness is above the recent midpoint, we must be
            // entirely exposed, so the plunger is all the way back.

            // figure the average of the recent midpoint brightnesses
            int sum = 0;
            for (int i = 0 ; i < countof(midpt) ; sum += midpt[i++]) ;
            sum /= countof(midpt);

            // Figure the average of our two ends. We have very
            // little contrast overall, so we already know that the
            // two ends are about the same, but we can't expect the
            // lighting to be perfectly uniform. Averaging the ends
            // will smooth out variations due to light source placement,
            // sensor noise, etc.
            a = (a+b)/2;

            // Check if we seem to be fully exposed or fully covered.
            pos = a < sum ? 0 : n;

            // stop here with a successful reading
            return true;
        }
        else
        {
            // We can't detect the orientation from this image, and
            // we don't know it from previous images, so we have nothing
            // to go on. Give up and return failure.
            return false;
        }

        // Figure the crossover brightness levels for detecting the edge.
        // The midpoint is the brightness level halfway between the bright
        // and dark regions we detected at the opposite ends of the sensor.
        // To find the edge, we'll look for a brightness level slightly
        // *past* the midpoint, to help reject noise - the bright region
        // pixels should all cluster close to the higher level, and the
        // shadow region should all cluster close to the lower level.
        // We'll define "close" as within 1/3 of the gap between the
        // extremes.
        int mid = (a+b)/2;

        // Scan from the bright side, looking for a pixel that drops below the
        // midpoint brightness. To reduce false positives from noise, check to
        // see if the majority of the next few pixels stay in shadow - if not,
        // consider the dark pixel to be some kind of transient noise, and
        // continue looking for a more solid edge.
        for (int i = 5 ; i < n-5 ; ++i, bi += dir)
        {
            // check to see if we found a dark pixel
            if (pix[bi] < mid)
            {
                // make sure we have a sustained edge
                int ok = 0;
                int bi2 = bi + dir;
                for (int j = 0 ; j < 5 ; ++j, bi2 += dir)
                {
                    // count this pixel if it's darker than the midpoint
                    if (pix[bi2] < mid)
                        ++ok;
                }

                // if we're clearly in the dark section, we have our edge
                if (ok > 3)
                {
                    // Success. Since we found an edge in this scan, save the
                    // midpoint brightness level in our history list, to help
                    // with any future frames with insufficient contrast.
                    midpt[midptIdx++] = mid;
                    midptIdx %= countof(midpt);

                    // return the detected position
                    pos = i;
                    return true;
                }
            }
        }

        // no edge found
        return false;
    }
#endif // SCAN_METHOD 0
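The one-way scan above can be condensed into a standalone sketch for the standard orientation (bright end at index 0). `findEdgeOneWay` is an illustrative name; the sustained-edge confirmation (`ok > 3`) follows the method-0 logic, without the orientation handling and midpoint history of the real code.

```cpp
#include <cstdint>

// Sketch of method 0 for the standard orientation: walk from the bright
// end toward the dark end, stop at the first pixel below the midpoint
// that is confirmed by a majority of the next few pixels also being dark.
// Returns the edge position in pixels, or -1 if no edge is found.
int findEdgeOneWay(const uint8_t *pix, int n)
{
    // sum the first and last 5 pixels
    int a = 0, b = 0;
    for (int i = 0; i < 5; ++i) {
        a += pix[i];
        b += pix[n - 1 - i];
    }

    // midpoint of the two 5-pixel averages: (a/5 + b/5)/2 == (a+b)/10
    int mid = (a + b) / 10;

    for (int i = 5; i < n - 5; ++i) {
        if (pix[i] < mid) {
            // require a sustained edge: most of the next 5 pixels dark too
            int ok = 0;
            for (int j = 1; j <= 5; ++j)
                if (pix[i + j] < mid)
                    ++ok;
            if (ok > 3)
                return i;
        }
    }
    return -1;    // no edge found
}
```

A single anomalously dark pixel fails the majority check, so the scan continues past it instead of reporting a false edge.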

#if SCAN_METHOD == 1
    // Scan method 1: meet in the middle.
    bool process(const uint8_t *pix, int n, int &pos, int& /*processResult*/)
    {
        // Get the levels at each end
        int a = (int(pix[0]) + pix[1] + pix[2] + pix[3] + pix[4])/5;
        int b = (int(pix[n-1]) + pix[n-2] + pix[n-3] + pix[n-4] + pix[n-5])/5;

        // Figure the sensor orientation based on the relative brightness
        // levels at the opposite ends of the image. We're going to scan
        // across the image from each side - 'bi' is the starting index
        // scanning from the bright side, 'di' is the starting index on
        // the dark side. 'binc' and 'dinc' are the pixel increments
        // for the respective indices.
        int bi, di;
        int binc, dinc;
        if (a > b+10)
        {
            // left end is brighter - standard orientation
            dir = 1;
            bi = 4, di = n - 5;
            binc = 1, dinc = -1;
        }
        else if (b > a+10)
        {
            // right end is brighter - reverse orientation
            dir = -1;
            bi = n - 5, di = 4;
            binc = -1, dinc = 1;
        }
        else
        {
            // can't detect direction
            return false;
        }

        // Figure the crossover brightness levels for detecting the edge.
        // The midpoint is the brightness level halfway between the bright
        // and dark regions we detected at the opposite ends of the sensor.
        // To find the edge, we'll look for a brightness level slightly
        // *past* the midpoint, to help reject noise - the bright region
        // pixels should all cluster close to the higher level, and the
        // shadow region should all cluster close to the lower level.
        // We'll define "close" as within 1/3 of the gap between the
        // extremes.
        int mid = (a+b)/2;
        int delta6 = abs(a-b)/6;
        int crossoverHi = mid + delta6;
        int crossoverLo = mid - delta6;

        // Scan inward from each end, looking for edges. Each time we
        // find an edge from one direction, we'll see if the scan from the
        // other direction agrees. If it does, we have a winner. If they
        // don't agree, we must have found some noise in one direction or the
        // other, so switch sides and continue the scan. On each continued
        // scan, if the stopping point from the last scan *was* noise, we'll
        // start seeing the expected non-edge pixels again as we move on,
        // so we'll effectively factor out the noise. If what stopped us
        // *wasn't* noise but was a legitimate edge, we'll see that we're
        // still in the region that stopped us in the first place and just
        // stop again immediately.
        //
        // The two sides have to converge, because they march relentlessly
        // towards each other until they cross. Even if we have a totally
        // random bunch of pixels, the two indices will eventually meet and
        // we'll declare that to be the edge position. The processing time
        // is linear in the pixel count - it's equivalent to one pass over
        // the pixels. The measured time for 1280 pixels is about 1.3ms,
        // which is about half the DMA transfer time. Our goal is always
        // to complete the processing in less than the DMA transfer time,
        // since that's as fast as we can possibly go with the physical
        // sensor. Since our processing time is overlapped with the DMA
        // transfer, the overall frame rate is limited by the *longer* of
        // the two times, not the sum of the two times. So as long as the
        // processing takes less time than the DMA transfer, we're not
        // contributing at all to the overall frame rate limit - it's like
        // we're not even here.
        for (;;)
        {
            // scan from the bright side
            for (bi += binc ; bi >= 5 && bi <= n-6 ; bi += binc)
            {
                // if we found a dark pixel, consider it to be an edge
                if (pix[bi] < crossoverLo)
                    break;
            }

            // if we reached an extreme, return failure
            if (bi < 5 || bi > n-6)
                return false;

            // if the two directions crossed, we have a winner
            if (binc > 0 ? bi >= di : bi <= di)
            {
                pos = (dir == 1 ? bi : n - bi);
                return true;
            }

            // they haven't converged yet, so scan from the dark side
            for (di += dinc ; di >= 5 && di <= n-6 ; di += dinc)
            {
                // if we found a bright pixel, consider it to be an edge
                if (pix[di] > crossoverHi)
                    break;
            }

            // if we reached an extreme, return failure
            if (di < 5 || di > n-6)
                return false;

            // if they crossed now, we have a winner
            if (binc > 0 ? bi >= di : bi <= di)
            {
                pos = (dir == 1 ? di : n - di);
                return true;
            }
        }
    }
#endif // SCAN_METHOD 1
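The convergence loop is easier to follow in a standalone sketch for the standard orientation. `meetInMiddle` is a hypothetical name; it reproduces the crossoverHi/crossoverLo thresholds (midpoint plus or minus 1/6 of the contrast range) and the alternating bright-side/dark-side scans, but omits the reverse-orientation bookkeeping.

```cpp
#include <cstdint>
#include <cstdlib>

// Sketch of method 1 for the standard orientation (bright end at index 0).
// Two cursors march toward each other: 'bi' stops at pixels darker than
// crossoverLo, 'di' stops at pixels brighter than crossoverHi; when the
// cursors cross, that point is taken as the edge. Returns -1 on failure.
int meetInMiddle(const uint8_t *pix, int n)
{
    // average the first and last 5 pixels
    int a = 0, b = 0;
    for (int i = 0; i < 5; ++i) {
        a += pix[i];
        b += pix[n - 1 - i];
    }
    a /= 5; b /= 5;

    int mid = (a + b) / 2;
    int delta6 = abs(a - b) / 6;
    int crossoverHi = mid + delta6;   // dark-side scan stops above this
    int crossoverLo = mid - delta6;   // bright-side scan stops below this

    int bi = 4, di = n - 5;
    for (;;) {
        // scan from the bright side for a sufficiently dark pixel
        for (++bi; bi <= n - 6 && pix[bi] >= crossoverLo; ++bi) { }
        if (bi > n - 6) return -1;    // ran off the end
        if (bi >= di) return bi;      // the scans crossed - edge found

        // scan from the dark side for a sufficiently bright pixel
        for (--di; di >= 5 && pix[di] <= crossoverHi; --di) { }
        if (di < 5) return -1;        // ran off the end
        if (bi >= di) return di;      // crossed - edge found
    }
}
```

Because each cursor only ever moves inward, the loop is guaranteed to terminate after at most one combined pass over the pixels.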

#if SCAN_METHOD == 2
    // Scan method 2: scan for steepest brightness slope.
    virtual bool process(const uint8_t *pix, int n, int &pos, int& /*processResult*/)
    {
        // Get the levels at each end by averaging across several pixels.
        // Compute just the sums: don't bother dividing by the count, since
        // the sums are equivalent to the averages as long as we know
        // everything is multiplied by the number of samples.
        int a = (int(pix[0]) + pix[1] + pix[2] + pix[3] + pix[4]);
        int b = (int(pix[n-1]) + pix[n-2] + pix[n-3] + pix[n-4] + pix[n-5]);

        // Figure the sensor orientation based on the relative brightness
        // levels at the opposite ends of the image. 'dir' records the
        // orientation, which tells the edge scanner which end is bright
        // and lets us report the position relative to the logical front
        // of the plunger.
        if (a > b + 50)
        {
            // left end is brighter - standard orientation
            dir = 1;
        }
        else if (b > a + 50)
        {
            // right end is brighter - reverse orientation
            dir = -1;
        }
        else
        {
            // can't determine direction
            return false;
        }

        // scan for the steepest edge using the assembly language
        // implementation (since the C++ version is too slow)
        const uint8_t *edgep = 0;
        if (edgeScanMode2(pix, n, &edgep, dir))
        {
            // edgep has the pixel array pointer; convert it to an offset
            pos = edgep - pix;

            // if the sensor orientation is reversed, figure the index from
            // the other end of the array
            if (dir < 0)
                pos = n - pos;

            // success
            return true;
        }
        else
        {
            // no edge found
            return false;
        }
    }
#endif // SCAN_METHOD 2
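Since the assembly routine isn't shown here, a portable C++ sketch of the same maximum-slope idea may help. `WINDOW` and `maxSlopeEdge` are illustrative choices, not the firmware's; the real `edgeScanMode2` uses its own window size, handles orientation via its `dir` argument, and returns a pixel pointer rather than an index.

```cpp
#include <cstdint>

// Sketch of method 2: slide a window across the image and report the
// position where the summed brightness drop between the trailing and
// leading halves of the window is largest. Averaging over WINDOW pixels
// on each side smooths out single-pixel noise spikes.
static const int WINDOW = 8;   // pixels summed on each side of the candidate edge

int maxSlopeEdge(const uint8_t *pix, int n)
{
    int bestPos = -1, bestSlope = 0;
    for (int i = WINDOW; i < n - WINDOW; ++i) {
        int left = 0, right = 0;
        for (int j = 0; j < WINDOW; ++j) {
            left += pix[i - 1 - j];   // trailing (brighter) side
            right += pix[i + j];      // leading (darker) side
        }
        int slope = left - right;     // positive where brightness falls
        if (slope > bestSlope) {
            bestSlope = slope;
            bestPos = i;
        }
    }
    return bestPos;                   // -1 if the image is flat
}
```

The inner loop makes this O(n * WINDOW) as written; a production version would maintain running sums (or, as here, drop to assembly) to keep the cost at one pass over the array.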

#if SCAN_METHOD == 3
    // Scan method 3: total bright pixel count.
    bool process(const uint8_t *pix, int n, int &pos, int& /*processResult*/)
    {
        // Get the levels at each end
        int a = (int(pix[0]) + pix[1] + pix[2] + pix[3] + pix[4])/5;
        int b = (int(pix[n-1]) + pix[n-2] + pix[n-3] + pix[n-4] + pix[n-5])/5;

        // Figure the sensor orientation based on the relative brightness
        // levels at the opposite ends of the image. 'dir' records the
        // orientation so that the bright-pixel count can be reported
        // relative to the logical front of the plunger.
        if (a > b+10)
        {
            // left end is brighter - standard orientation
            dir = 1;
        }
        else if (b > a+10)
        {
            // right end is brighter - reverse orientation
            dir = -1;
        }
        else
        {
            // We can't detect the orientation from this image
            return false;
        }

        // Figure the brightness midpoint for classifying pixels. The
        // midpoint is the brightness level halfway between the bright
        // and dark levels we detected at the opposite ends of the sensor;
        // pixels above it count as "bright" and pixels below it count
        // as "shadow".
        int mid = (a+b)/2;

        // Count pixels brighter than the brightness midpoint. We assume
        // that all of the bright pixels are contiguously within the bright
        // region, so we simply have to count them up. Even if we have a
        // few noisy pixels in the dark region above the midpoint, these
        // should on average be canceled out by anomalous dark pixels in
        // the bright region.
        int bcnt = 0;
        for (int i = 0 ; i < n ; ++i)
        {
            if (pix[i] > mid)
                ++bcnt;
        }

        // The position is simply the size of the bright region
        pos = bcnt;
        if (dir < 1)
            pos = n - pos;
        return true;
    }
#endif // SCAN_METHOD 3

protected:
    // Sensor orientation. +1 means that the "tip" end - which is always
    // the brighter end in our images - is at the 0th pixel in the array.
    // -1 means that the tip is at the nth pixel in the array. 0 means
    // that we haven't figured it out yet. We automatically infer this
    // from the relative light levels at each end of the array when we
    // successfully find a shadow edge. The reason we save the information
    // is that we might occasionally get frames that are fully in shadow
    // or fully in light, and we can't infer the direction from such
    // frames. Saving the information from past frames gives us a fallback
    // when we can't infer it from the current frame. Note that we update
    // this each time we can infer the direction, so the device will adapt
    // on the fly even if the user repositions the sensor while the software
    // is running.
    virtual int getOrientation() const { return dir; }
    int dir;

    // History of midpoint brightness levels for the last few successful
    // scans. This is a circular buffer that we write on each scan where
    // we successfully detect a shadow edge. (It's circular, so we
    // effectively discard the oldest element whenever we write a new one.)
    //
    // We use the history in cases where we have too little contrast to
    // detect an edge. In these cases, we assume that the entire sensor
    // is either in shadow or light, which can happen if the plunger is at
    // one extreme or the other such that the edge of its shadow is out of
    // the frame. (Ideally, the sensor should be positioned so that the
    // shadow edge is always in the frame, but it's not always possible
    // to do this given the constrained space within a cabinet.) The
    // history helps us decide which case we have - all shadow or all
    // light - by letting us compare our average pixel level in this
    // frame to the range in recent frames. This assumes that the
    // exposure level is fairly consistent from frame to frame, which
    // is usually true because the sensor and light source are both
    // fixed in place.
    //
    // We always try first to infer the bright and dark levels from the
    // image, since this lets us adapt automatically to different exposure
    // levels. The exposure level can vary by integration time and the
    // intensity and positioning of the light source, and we want
    // to be as flexible as we can about both.
    uint8_t midpt[10];
    uint8_t midptIdx;
};
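The circular midpoint history can be exercised in isolation. `MidpointHistory` is a hypothetical standalone wrapper around the same `midpt`/`midptIdx` scheme the class uses, added here only to illustrate the write-and-average behavior described in the comments above.

```cpp
#include <cstdint>

// Sketch of the circular midpoint-history buffer: each successful scan
// records its midpoint brightness, overwriting the oldest entry once the
// buffer is full; the history average stands in for the midpoint when a
// frame has no usable contrast.
struct MidpointHistory
{
    uint8_t midpt[10];
    uint8_t midptIdx;
    int count;              // entries written so far, capped at 10

    MidpointHistory() : midptIdx(0), count(0) { }

    // record the midpoint from a successful scan
    void add(uint8_t mid)
    {
        midpt[midptIdx++] = mid;
        midptIdx %= 10;     // wrap - discards the oldest entry
        if (count < 10) ++count;
    }

    // average of the recorded midpoints, 0 if none yet
    int average() const
    {
        if (count == 0) return 0;
        int sum = 0;
        for (int i = 0; i < count; ++i)
            sum += midpt[i];
        return sum / count;
    }
};
```

In the firmware the buffer is simply initialized and averaged in place; the wrapper just makes the wrap-around and averaging testable on their own.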

#endif /* _EDGESENSOR_H_ */