Mirror with some correction

Dependencies:   mbed FastIO FastPWM USBDevice

Committer: mjr
Date: Fri Apr 21 18:50:37 2017 +0000
Revision: 86:e30a1f60f783
Parent: 82:4f6209cb5c33
Child: 87:8d35c74403af
Capture a bunch of alternative bar code decoder tests, mostly unsuccessful

// Edge position sensor - 2D optical
//
// This class implements our plunger sensor interface using edge
// detection on a 2D optical sensor.  With this setup, a 2D optical
// sensor is placed close to the plunger, parallel to the rod, with a
// light source opposite the plunger.  This makes the plunger cast a
// shadow on the sensor.  We figure the plunger position by detecting
// where the shadow is, by finding the edge between the bright and
// dark regions in the image.
//
// This class is designed to work with any type of 2D optical sensor.
// We have subclasses for the TSL1410R and TSL1412S sensors, but other
// similar sensors could be supported as well by adding interfaces for
// the physical electronics.  For the edge detection, we just need an
// array of pixel readings.

#ifndef _EDGESENSOR_H_
#define _EDGESENSOR_H_

#include "plunger.h"
#include "tsl14xxSensor.h"

// Scan method - select a method listed below.  Method 2 (find the point
// with the maximum brightness slope) seems to work the best so far.
#define SCAN_METHOD 2
//
//
// 0 = One-way scan.  This is the original algorithm from the v1 software,
//     with some slight improvements.  We start at the brighter end of the
//     sensor and scan until we find a pixel darker than a threshold level
//     (halfway between the respective brightness levels at the bright and
//     dark ends of the sensor).  The original v1 algorithm simply stopped
//     there.  This version is slightly improved: it scans for a few more
//     pixels to make sure that the majority of the adjacent pixels are
//     also in shadow, to help reject false edges from sensor noise or
//     optical shadows that make one pixel read darker than it should.
//
// 1 = Meet in the middle.  We start two scans concurrently, one from
//     the dark end of the sensor and one from the bright end.  For
//     the scan from the dark end, we stop when we reach a pixel that's
//     brighter than the average dark level by 2/3 of the gap between
//     the dark and bright levels.  For the scan from the bright end,
//     we stop when we reach a pixel that's darker by 2/3 of the gap.
//     Each time we stop, we look to see if the other scan has reached
//     the same place.  If so, the two scans converged on a common
//     point, which we take to be the edge between the dark and bright
//     sections.  If the two scans haven't converged yet, we switch to
//     the other scan and continue it.  We repeat this process until
//     the two converge.  The benefit of this approach vs the older
//     one-way scan is that it's much more tolerant of noise, and the
//     degree of noise tolerance is dictated by how noisy the signal
//     actually is.  The dynamic degree of tolerance is good because
//     higher noise tolerance tends to result in reduced resolution.
//
// 2 = Maximum dL/ds (highest first derivative of luminance change per
//     distance, or put another way, the steepest brightness slope).
//     This scans the whole image and looks for the position with the
//     highest dL/ds value.  We average over a window of several pixels,
//     to smooth out pixel noise; this should avoid treating a single
//     spiky pixel as having a steep slope adjacent to it.  The advantage
//     in this approach is that it looks for the strongest edge after
//     considering all edges across the whole image, which should make
//     it less likely to be fooled by isolated noise that creates a
//     single false edge.  Methods 0 and 1 have basically fixed
//     thresholds for what constitutes an edge, but this approach is
//     more dynamic in that it evaluates each edge-like region and picks
//     the best one.  The width of the edge is still fixed, since that's
//     determined by the pixel window.  But that should be okay since we
//     only deal with one type of image.  It should be possible to adjust
//     the light source and sensor position to always yield an image with
//     a narrow enough edge region.
//
//     The max dL/ds method is the most compute-intensive method, because
//     of the pixel window averaging.  An assembly language implementation
//     seems to be needed to make it fast enough on the KL25Z.  This method
//     has a fixed run time because it always does exactly one pass over
//     the whole pixel array.
//
// 3 = Total bright pixel count.  This simply adds up the total number
//     of pixels above a threshold brightness, without worrying about
//     whether they're contiguous with other pixels on the same side
//     of the edge.  Since we know there's always exactly one edge,
//     all of the dark pixels should in principle be on one side, and
//     all of the light pixels should be on the other side.  There
//     might be some noise that creates isolated pixels that don't
//     match their neighbors, but these should average out.  The virtue
//     of this approach (apart from its simplicity) is that it should
//     be immune to false edges - local spikes due to noise - that
//     might fool the algorithms that explicitly look for edges.  In
//     practice, though, it seems to be even more sensitive to noise
//     than the other algorithms, probably because it treats every pixel
//     as independent and thus doesn't have any sort of inherent noise
//     reduction from considering relationships among pixels.
//

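As a rough illustration of what the "mode 2" assembly routine computes, here is a standalone C++ sketch of the windowed-slope search.  The function name `edgeScanMode2Sketch` and the 8-pixel window are assumptions for illustration only; the shipped assembly implementation may use different parameters and a faster running-sum formulation.

```cpp
#include <cstdint>

// Hypothetical window size for the slope averaging; the real routine's
// window may differ.
static const int kWindow = 8;

// Illustrative sketch of the maximum-slope scan: average a window of
// pixels on each side of every candidate position and pick the position
// with the largest brightness drop in the scan direction.  'dir' is +1
// when the bright end is at index 0, -1 when it's at the high end.  On
// success, sets 'edge' to the pixel index of the steepest slope.
bool edgeScanMode2Sketch(const uint8_t *pix, int npix, int &edge, int dir)
{
    int bestSlope = 0, bestPos = -1;
    for (int i = kWindow ; i <= npix - kWindow ; ++i)
    {
        // sum the window below position i and the window at/above it
        int lo = 0, hi = 0;
        for (int j = 1 ; j <= kWindow ; ++j)
        {
            lo += pix[i - j];
            hi += pix[i + j - 1];
        }

        // the edge is a brightness drop from the bright side to the
        // dark side, so measure the drop in the scan direction
        int slope = (dir > 0 ? lo - hi : hi - lo);
        if (slope > bestSlope)
        {
            bestSlope = slope;
            bestPos = i;
        }
    }
    edge = bestPos;
    return bestPos >= 0;
}
```

As written this is O(npix * window); keeping running window sums would reduce it to a single linear pass, matching the fixed one-pass run time described above.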
// assembler routine to scan for an edge using "mode 2" (maximum slope)
extern "C" int edgeScanMode2(
    const uint8_t *pix, int npix, const uint8_t **edgePtr, int dir);

// PlungerSensor interface implementation for edge detection setups.
// This is a generic base class for image-based sensors where we detect
// the plunger position by finding the edge of the shadow it casts on
// the detector.
class PlungerSensorEdgePos
{
public:
    PlungerSensorEdgePos(int npix)
    {
        native_npix = npix;
    }

    // Process an image - scan for the shadow edge to determine the plunger
    // position.
    //
    // If we detect the plunger position, we set 'pos' to the pixel location
    // of the edge and return true; otherwise we return false.  The 'pos'
    // value returned, if any, is adjusted for sensor orientation so that
    // it reflects the logical plunger position (i.e., distance retracted,
    // where 0 is always the fully forward position and 'n' is fully
    // retracted).

#if SCAN_METHOD == 0
    // Scan method 0: one-way scan; original method used in v1 firmware.
    bool process(const uint8_t *pix, int n, int &pos)
    {
        // Get the levels at each end
        int a = (int(pix[0]) + pix[1] + pix[2] + pix[3] + pix[4])/5;
        int b = (int(pix[n-1]) + pix[n-2] + pix[n-3] + pix[n-4] + pix[n-5])/5;

        // Figure the sensor orientation based on the relative brightness
        // levels at the opposite ends of the image.  We'll scan across
        // the image starting from the bright side - 'bi' is the starting
        // index for the scan, and 'dir' is the pixel increment (+1 when
        // the bright end is at index 0, -1 when it's at the other end).
        int bi;
        if (a > b+10)
        {
            // left end is brighter - standard orientation
            dir = 1;
            bi = 4;
        }
        else if (b > a+10)
        {
            // right end is brighter - reverse orientation
            dir = -1;
            bi = n - 5;
        }
        else if (dir != 0)
        {
            // We don't have enough contrast to detect the orientation
            // from this image, so either the image is too overexposed
            // or underexposed to be useful, or the entire sensor is in
            // light or darkness.  We'll assume the latter: the plunger
            // is blocking the whole window or isn't in the frame at
            // all.  We'll also assume that the exposure level is
            // similar to that in recent frames where we *did* detect
            // the direction.  This means that if the new exposure level
            // (which is about the same over the whole array) is less
            // than the recent midpoint, we must be entirely blocked
            // by the plunger, so it's all the way forward; if the
            // brightness is above the recent midpoint, we must be
            // entirely exposed, so the plunger is all the way back.

            // figure the average of the recent midpoint brightnesses
            int sum = 0;
            for (int i = 0 ; i < countof(midpt) ; sum += midpt[i++]) ;
            sum /= countof(midpt);

            // Figure the average of our two ends.  We have very
            // little contrast overall, so we already know that the
            // two ends are about the same, but we can't expect the
            // lighting to be perfectly uniform.  Averaging the ends
            // will smooth out variations due to light source placement,
            // sensor noise, etc.
            a = (a+b)/2;

            // Check if we seem to be fully exposed or fully covered
            pos = a < sum ? 0 : n;
            return true;
        }
        else
        {
            // We can't detect the orientation from this image, and
            // we don't know it from previous images, so we have nothing
            // to go on.  Give up and return failure.
            return false;
        }

        // Figure the crossover brightness level for detecting the edge:
        // the midpoint between the bright and dark levels we detected
        // at the opposite ends of the sensor.
        int mid = (a+b)/2;

        // Scan from the bright side, looking for a pixel that drops below
        // the midpoint brightness.  To reduce false positives from noise,
        // check whether the majority of the next few pixels stay in shadow
        // - if not, consider the dark pixel to be some kind of transient
        // noise, and continue looking for a more solid edge.
        for (int i = 5 ; i < n-5 ; ++i, bi += dir)
        {
            // check to see if we found a dark pixel
            if (pix[bi] < mid)
            {
                // make sure we have a sustained edge
                int ok = 0;
                int bi2 = bi + dir;
                for (int j = 0 ; j < 5 ; ++j, bi2 += dir)
                {
                    // count this pixel if it's darker than the midpoint
                    if (pix[bi2] < mid)
                        ++ok;
                }

                // if we're clearly in the dark section, we have our edge
                if (ok > 3)
                {
                    // Success.  Since we found an edge in this scan, save the
                    // midpoint brightness level in our history list, to help
                    // with any future frames with insufficient contrast.
                    midpt[midptIdx++] = mid;
                    midptIdx %= countof(midpt);

                    // return the detected position
                    pos = i;
                    return true;
                }
            }
        }

        // no edge found
        return false;
    }
#endif // SCAN_METHOD 0

#if SCAN_METHOD == 1
    // Scan method 1: meet in the middle.
    bool process(const uint8_t *pix, int n, int &pos)
    {
        // Get the levels at each end
        int a = (int(pix[0]) + pix[1] + pix[2] + pix[3] + pix[4])/5;
        int b = (int(pix[n-1]) + pix[n-2] + pix[n-3] + pix[n-4] + pix[n-5])/5;

        // Figure the sensor orientation based on the relative brightness
        // levels at the opposite ends of the image.  We're going to scan
        // across the image from each side - 'bi' is the starting index
        // scanning from the bright side, 'di' is the starting index on
        // the dark side.  'binc' and 'dinc' are the pixel increments
        // for the respective indices.
        int bi, di;
        int binc, dinc;
        if (a > b+10)
        {
            // left end is brighter - standard orientation
            dir = 1;
            bi = 4, di = n - 5;
            binc = 1, dinc = -1;
        }
        else if (b > a+10)
        {
            // right end is brighter - reverse orientation
            dir = -1;
            bi = n - 5, di = 4;
            binc = -1, dinc = 1;
        }
        else
        {
            // can't detect direction
            return false;
        }

        // Figure the crossover brightness levels for detecting the edge.
        // The midpoint is the brightness level halfway between the bright
        // and dark regions we detected at the opposite ends of the sensor.
        // To find the edge, we'll look for a brightness level slightly
        // *past* the midpoint, to help reject noise - the bright region
        // pixels should all cluster close to the higher level, and the
        // shadow region should all cluster close to the lower level.
        // We'll define "close" as within 1/3 of the gap between the
        // extremes.
        int mid = (a+b)/2;
        int delta6 = abs(a-b)/6;
        int crossoverHi = mid + delta6;
        int crossoverLo = mid - delta6;

        // Scan inward from each end, looking for edges.  Each time we
        // find an edge from one direction, we'll see if the scan from the
        // other direction agrees.  If it does, we have a winner.  If they
        // don't agree, we must have found some noise in one direction or the
        // other, so switch sides and continue the scan.  On each continued
        // scan, if the stopping point from the last scan *was* noise, we'll
        // start seeing the expected non-edge pixels again as we move on,
        // so we'll effectively factor out the noise.  If what stopped us
        // *wasn't* noise but was a legitimate edge, we'll see that we're
        // still in the region that stopped us in the first place and just
        // stop again immediately.
        //
        // The two sides have to converge, because they march relentlessly
        // towards each other until they cross.  Even if we have a totally
        // random bunch of pixels, the two indices will eventually meet and
        // we'll declare that to be the edge position.  The processing time
        // is linear in the pixel count - it's equivalent to one pass over
        // the pixels.  The measured time for 1280 pixels is about 1.3ms,
        // which is about half the DMA transfer time.  Our goal is always
        // to complete the processing in less than the DMA transfer time,
        // since that's as fast as we can possibly go with the physical
        // sensor.  Since our processing time is overlapped with the DMA
        // transfer, the overall frame rate is limited by the *longer* of
        // the two times, not the sum of the two times.  So as long as the
        // processing takes less time than the DMA transfer, we're not
        // contributing at all to the overall frame rate limit - it's like
        // we're not even here.
        for (;;)
        {
            // scan from the bright side
            for (bi += binc ; bi >= 5 && bi <= n-6 ; bi += binc)
            {
                // if we found a dark pixel, consider it to be an edge
                if (pix[bi] < crossoverLo)
                    break;
            }

            // if we reached an extreme, return failure
            if (bi < 5 || bi > n-6)
                return false;

            // if the two directions crossed, we have a winner
            if (binc > 0 ? bi >= di : bi <= di)
            {
                pos = (dir == 1 ? bi : n - bi);
                return true;
            }

            // they haven't converged yet, so scan from the dark side
            for (di += dinc ; di >= 5 && di <= n-6 ; di += dinc)
            {
                // if we found a bright pixel, consider it to be an edge
                if (pix[di] > crossoverHi)
                    break;
            }

            // if we reached an extreme, return failure
            if (di < 5 || di > n-6)
                return false;

            // if they crossed now, we have a winner
            if (binc > 0 ? bi >= di : bi <= di)
            {
                pos = (dir == 1 ? di : n - di);
                return true;
            }
        }
    }
#endif // SCAN_METHOD 1

#if SCAN_METHOD == 2
    // Scan method 2: scan for steepest brightness slope.
    bool process(const uint8_t *pix, int n, int &pos)
    {
        // Get the levels at each end by averaging across several pixels.
        // Compute just the sums: don't bother dividing by the count, since
        // the sums are equivalent to the averages as long as we know
        // everything is multiplied by the number of samples.
        int a = (int(pix[0]) + pix[1] + pix[2] + pix[3] + pix[4]);
        int b = (int(pix[n-1]) + pix[n-2] + pix[n-3] + pix[n-4] + pix[n-5]);

        // Figure the sensor orientation based on the relative brightness
        // levels at the opposite ends of the image.  'dir' is +1 when the
        // bright end is at pixel 0 (standard orientation), -1 when it's
        // at the high end of the array (reverse orientation).
        if (a > b + 50)
        {
            // left end is brighter - standard orientation
            dir = 1;
        }
        else if (b > a + 50)
        {
            // right end is brighter - reverse orientation
            dir = -1;
        }
        else
        {
            // can't determine direction
            return false;
        }

        // scan for the steepest edge using the assembly language
        // implementation (since the C++ version is too slow)
        const uint8_t *edgep = 0;
        if (edgeScanMode2(pix, n, &edgep, dir))
        {
            // edgep has the pixel array pointer; convert it to an offset
            pos = edgep - pix;

            // if the sensor orientation is reversed, figure the index from
            // the other end of the array
            if (dir < 0)
                pos = n - pos;

            // success
            return true;
        }
        else
        {
            // no edge found
            return false;
        }
    }
#endif // SCAN_METHOD 2

#if SCAN_METHOD == 3
    // Scan method 3: total bright pixel count.
    bool process(const uint8_t *pix, int n, int &pos)
    {
        // Get the levels at each end
        int a = (int(pix[0]) + pix[1] + pix[2] + pix[3] + pix[4])/5;
        int b = (int(pix[n-1]) + pix[n-2] + pix[n-3] + pix[n-4] + pix[n-5])/5;

        // Figure the sensor orientation based on the relative brightness
        // levels at the opposite ends of the image.  'dir' is +1 when the
        // bright end is at pixel 0 (standard orientation), -1 when it's
        // at the high end of the array (reverse orientation).
        if (a > b+10)
        {
            // left end is brighter - standard orientation
            dir = 1;
        }
        else if (b > a+10)
        {
            // right end is brighter - reverse orientation
            dir = -1;
        }
        else
        {
            // We can't detect the orientation from this image
            return false;
        }

        // Figure the brightness midpoint - the level halfway between the
        // bright and dark regions we detected at the opposite ends of the
        // sensor.  Pixels brighter than this level count as part of the
        // exposed (bright) region; pixels darker count as shadow.
        int mid = (a+b)/2;

        // Count pixels brighter than the brightness midpoint.  We assume
        // that all of the bright pixels are contiguously within the bright
        // region, so we simply have to count them up.  Even if we have a
        // few noisy pixels in the dark region above the midpoint, these
        // should on average be canceled out by anomalous dark pixels in
        // the bright region.
        int bcnt = 0;
        for (int i = 0 ; i < n ; ++i)
        {
            if (pix[i] > mid)
                ++bcnt;
        }

        // The position is simply the size of the bright region
        pos = bcnt;
        if (dir < 1)
            pos = n - pos;
        return true;
    }
#endif // SCAN_METHOD 3

mjr 82:4f6209cb5c33 482
mjr 82:4f6209cb5c33 483 protected:
mjr 82:4f6209cb5c33 484 // Sensor orientation. +1 means that the "tip" end - which is always
mjr 82:4f6209cb5c33 485 // the brighter end in our images - is at the 0th pixel in the array.
mjr 82:4f6209cb5c33 486 // -1 means that the tip is at the nth pixel in the array. 0 means
mjr 82:4f6209cb5c33 487 // that we haven't figured it out yet. We automatically infer this
mjr 82:4f6209cb5c33 488 // from the relative light levels at each end of the array when we
mjr 82:4f6209cb5c33 489 // successfully find a shadow edge. The reason we save the information
mjr 82:4f6209cb5c33 490 // is that we might occasionally get frames that are fully in shadow
mjr 82:4f6209cb5c33 491 // or fully in light, and we can't infer the direction from such
mjr 82:4f6209cb5c33 492 // frames. Saving the information from past frames gives us a fallback
mjr 82:4f6209cb5c33 493 // when we can't infer it from the current frame. Note that we update
mjr 82:4f6209cb5c33 494 // this each time we can infer the direction, so the device will adapt
mjr 82:4f6209cb5c33 495 // on the fly even if the user repositions the sensor while the software
mjr 82:4f6209cb5c33 496 // is running.
mjr 82:4f6209cb5c33 497 virtual int getOrientation() const { return dir; }
mjr 82:4f6209cb5c33 498 int dir;
mjr 82:4f6209cb5c33 499
mjr 82:4f6209cb5c33 500 // number of pixels
mjr 82:4f6209cb5c33 501 int native_npix;
mjr 82:4f6209cb5c33 502
mjr 82:4f6209cb5c33 503 // History of midpoint brightness levels for the last few successful
mjr 82:4f6209cb5c33 504 // scans. This is a circular buffer that we write on each scan where
mjr 82:4f6209cb5c33 505 // we successfully detect a shadow edge. (It's circular, so we
mjr 82:4f6209cb5c33 506 // effectively discard the oldest element whenever we write a new one.)
mjr 82:4f6209cb5c33 507 //
mjr 82:4f6209cb5c33 508 // We use the history in cases where we have too little contrast to
mjr 82:4f6209cb5c33 509 // detect an edge. In these cases, we assume that the entire sensor
mjr 82:4f6209cb5c33 510 // is either in shadow or light, which can happen if the plunger is at
mjr 82:4f6209cb5c33 511 // one extreme or the other such that the edge of its shadow is out of
mjr 82:4f6209cb5c33 512 // the frame. (Ideally, the sensor should be positioned so that the
mjr 82:4f6209cb5c33 513 // shadow edge is always in the frame, but it's not always possible
mjr 82:4f6209cb5c33 514 // to do this given the constrained space within a cabinet.) The
mjr 82:4f6209cb5c33 515 // history helps us decide which case we have - all shadow or all
mjr 82:4f6209cb5c33 516 // light - by letting us compare our average pixel level in this
mjr 82:4f6209cb5c33 517 // frame to the range in recent frames. This assumes that the
mjr 82:4f6209cb5c33 518 // exposure level is fairly consistent from frame to frame, which
mjr 82:4f6209cb5c33 519 // is usually true because the sensor and light source are both
mjr 82:4f6209cb5c33 520 // fixed in place.
mjr 82:4f6209cb5c33 521 //
mjr 82:4f6209cb5c33 522 // We always try first to infer the bright and dark levels from the
mjr 82:4f6209cb5c33 523 // image, since this lets us adapt automatically to different exposure
mjr 82:4f6209cb5c33 524 // levels. The exposure level can vary by integration time and the
mjr 82:4f6209cb5c33 525 // intensity and positioning of the light source, and we want
mjr 82:4f6209cb5c33 526 // to be as flexible as we can about both.
mjr 82:4f6209cb5c33 527 uint8_t midpt[10];
mjr 82:4f6209cb5c33 528 uint8_t midptIdx;
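// As a sketch of the technique described above (this helper is
// hypothetical and not part of the actual class), the circular-buffer
// write on each successful scan might look like this:
//
//   void addMidpoint(uint8_t m)
//   {
//       midpt[midptIdx] = m;    // overwrite the oldest entry in place
//       midptIdx = (midptIdx + 1) % (sizeof(midpt)/sizeof(midpt[0]));
//   }
//
// The low-contrast fallback can then compare the current frame's
// average pixel level against the range of recent midpt[] values to
// decide whether the frame is all-shadow or all-light.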
mjr 82:4f6209cb5c33 529
mjr 82:4f6209cb5c33 531 };
mjr 82:4f6209cb5c33 532
mjr 82:4f6209cb5c33 533
mjr 82:4f6209cb5c33 534 // -------------------------------------------------------------------------
mjr 82:4f6209cb5c33 535 //
mjr 86:e30a1f60f783 536 // Edge position plunger sensor for TSL14xx-based sensors. An edge
mjr 86:e30a1f60f783 537 // detection setup requires one of the large sensors, 1410R or 1412S,
mjr 86:e30a1f60f783 538 // since we need the sensor to cover the whole extent of the physical
mjr 86:e30a1f60f783 539 // plunger's travel, which is about 3".
mjr 82:4f6209cb5c33 540 //
mjr 86:e30a1f60f783 541 // The native scale for image edge detectors is sensor pixels, since
mjr 86:e30a1f60f783 542 // we read the plunger position as the pixel location of the shadow
mjr 86:e30a1f60f783 543 // edge on the image.
mjr 86:e30a1f60f783 544 //
mjr 86:e30a1f60f783 545 class PlungerSensorEdgePosTSL14xx: public PlungerSensorTSL14xxLarge, public PlungerSensorEdgePos
mjr 82:4f6209cb5c33 546 {
mjr 82:4f6209cb5c33 547 public:
mjr 82:4f6209cb5c33 548 PlungerSensorEdgePosTSL14xx(int nativePix, PinName si, PinName clock, PinName ao)
mjr 86:e30a1f60f783 549 : PlungerSensorTSL14xxLarge(nativePix, nativePix - 1, si, clock, ao),
mjr 82:4f6209cb5c33 550 PlungerSensorEdgePos(nativePix)
mjr 82:4f6209cb5c33 551 {
mjr 82:4f6209cb5c33 552 // we don't know the direction yet
mjr 82:4f6209cb5c33 553 dir = 0;
mjr 82:4f6209cb5c33 554
mjr 82:4f6209cb5c33 555 // set the midpoint history arbitrarily to the middle of the 8-bit brightness scale
mjr 82:4f6209cb5c33 556 memset(midpt, 127, sizeof(midpt));
mjr 82:4f6209cb5c33 557 midptIdx = 0;
mjr 86:e30a1f60f783 558
mjr 86:e30a1f60f783 559 // the native reporting scale is the pixel size of the sensor, since
mjr 86:e30a1f60f783 560 // the position is figured as the shadow location in the image
mjr 86:e30a1f60f783 561 nativeScale = nativePix;
mjr 82:4f6209cb5c33 562 }
mjr 82:4f6209cb5c33 563
mjr 86:e30a1f60f783 564 protected:
mjr 82:4f6209cb5c33 565 // process the image through the edge detector
mjr 82:4f6209cb5c33 566 virtual bool process(const uint8_t *pix, int npix, int &pixpos)
mjr 82:4f6209cb5c33 567 {
mjr 82:4f6209cb5c33 568 return PlungerSensorEdgePos::process(pix, npix, pixpos);
mjr 82:4f6209cb5c33 569 }
mjr 82:4f6209cb5c33 570 };
mjr 82:4f6209cb5c33 571
mjr 82:4f6209cb5c33 572 // TSL1410R sensor
mjr 82:4f6209cb5c33 573 class PlungerSensorTSL1410R: public PlungerSensorEdgePosTSL14xx
mjr 82:4f6209cb5c33 574 {
mjr 82:4f6209cb5c33 575 public:
mjr 82:4f6209cb5c33 576 PlungerSensorTSL1410R(PinName si, PinName clock, PinName ao)
mjr 82:4f6209cb5c33 577 : PlungerSensorEdgePosTSL14xx(1280, si, clock, ao)
mjr 82:4f6209cb5c33 578 {
mjr 82:4f6209cb5c33 579 }
mjr 82:4f6209cb5c33 580 };
mjr 82:4f6209cb5c33 581
mjr 82:4f6209cb5c33 582 // TSL1412S sensor (the class name retains the "R" suffix)
mjr 82:4f6209cb5c33 583 class PlungerSensorTSL1412R: public PlungerSensorEdgePosTSL14xx
mjr 82:4f6209cb5c33 584 {
mjr 82:4f6209cb5c33 585 public:
mjr 82:4f6209cb5c33 586 PlungerSensorTSL1412R(PinName si, PinName clock, PinName ao)
mjr 82:4f6209cb5c33 587 : PlungerSensorEdgePosTSL14xx(1536, si, clock, ao)
mjr 82:4f6209cb5c33 588 {
mjr 82:4f6209cb5c33 589 }
mjr 82:4f6209cb5c33 590 };
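// Usage sketch (the pin names below are hypothetical; pass the pins
// actually wired to the sensor's SI, CLK, and AO lines):
//
//   PlungerSensorTSL1410R sensor(PTE20, PTE21, PTE22);
//
// The two subclasses differ only in native pixel count: 1280 pixels
// for the TSL1410R, 1536 for the TSL1412S.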
mjr 82:4f6209cb5c33 591
mjr 82:4f6209cb5c33 592 #endif /* _EDGESENSOR_H_ */