I'm not 100% sure I'm in the right place to report this, but: I'm testing the handwriting recognition demo ( https://azure.microsoft.com/en-ca/services/cognitive-services/computer-vision/#handwriting ) on some vintage postcards.
I've found that I can submit an image and get back either no results or nonsense. But if I submit the color-negative of the exact same image, the results are perfect.
Original image: https://1drv.ms/u/s!As3ZytCDCPbLkIJdz5P2TCj7nRijHA
Color inverted image: https://1drv.ms/u/s!As3ZytCDCPbLkIJcytVrtB0OrmDHTg
(In this case the text is typed, but I've been using the handwriting demo because these postcards are often hand printed.)
I thought this might suggest an avenue for improvement: the service could also process the negative of each submission and keep whichever result has the higher confidence.
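In the meantime, inverting the image before submission is easy to do client-side. Here's a minimal sketch using Pillow (the `invert_for_ocr` name is mine, not part of any API); the inverted file can then be uploaded to the demo as usual:

```python
from PIL import Image, ImageOps

def invert_for_ocr(img: Image.Image) -> Image.Image:
    """Return the color-negative of an image.

    Convert to RGB first, since ImageOps.invert does not
    accept palette or alpha-channel modes.
    """
    return ImageOps.invert(img.convert("RGB"))

# Example: invert a postcard scan and save it for upload.
# postcard = Image.open("postcard.jpg")
# invert_for_ocr(postcard).save("postcard_inverted.jpg")
```

Each channel value v becomes 255 - v, so dark ink on a light background becomes light ink on a dark background, which is apparently what the recognizer prefers here.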