It was always only a matter of time before automating avian and reptile CBCs became possible. Yet for decades it wasn't, and for several reasons. Those reasons are worth examining, because Moichor's technology is the sum of the innovations that resolved each of them.
Advances in the accessibility of optics, the miniaturization of chips and sensors, the connectivity of devices, and the sophistication of computer vision, along with the falling cost of each, have made it possible to design and bring this point-of-care device to market.
Only 10 years ago, if you wanted a specialized lens, you needed a contact at a manufacturer in Japan or Germany. Objectives were ground and shaped by hand and could cost upward of $50k for the lenses alone.
Since then, more countries have begun manufacturing lenses from materials suited to higher-throughput production, bringing lens costs down by several orders of magnitude.
Additionally, improvements in laser and sensor technology have enabled automated methods of manufacturing lenses. So whereas high-cost lenses were previously reserved for R&D, they can now be used in diagnostic and point-of-care devices in a way that would otherwise have been cost prohibitive.
Remember the phone you had in 2008, before you had a smartphone? The microprocessor is basically the thing that made your phone smart. Sure, the Nokia you had also had a microprocessor. But innovations in nanofabrication allowed sophisticated microprocessors that once fit only in a computer to fit in a phone or a wearable.
That same nanofabrication technology also enabled the miniaturization of the sensors, motors, and other components that make it possible to fit the Moichor microscope into a point-of-care-sized device.
Only five years ago, the components of a device like the one Moichor has built would have cost three to four orders of magnitude more to develop and would have filled an environmentally controlled room.
Remember over the past 10 years when words like ‘cloud computing’, ‘big data’, and ‘the internet of things’ (IoT) had us scratching our heads and watching YouTube videos of people explaining what they actually meant?
Those technologies are part of the connectivity foundation that enables companies like Moichor to build sophisticated software and hardware.
The major benefit of connectivity for a point-of-care device is that the device itself can be a simple conduit to more sophisticated software in the cloud.
In the same way that using Siri or Alexa requires your phone or device to be connected to the internet to enable the software to address your request, Moichor’s cloud infrastructure enables your slide to be compared with thousands of others within a central knowledge base. And in the case that the sample or species is especially unique, it gets sent directly to experienced pathologists like Kyle Webb, DVM, DACVP.
A decade ago before the ubiquity of IoT and the connecting of everything that could be connected, medical devices were hard-coded. With a hard-coded device, all of the computation takes place within that one device. Unless the device is regularly updated with new software, its computational capabilities remain the same — bugs and all.
Moichor’s in-clinic device wouldn’t have worked very well if it were hard-coded, because it wouldn’t be able to access the volume of other samples to compare against, or easily route a difficult sample to a pathologist for that matter. There are dozens of other reasons why connectivity creates a gateway to a whole realm of possibilities for in-clinic devices. But one of the most important is that it enables computer vision.
A bit over 10 years ago, improvements in computer hardware jump-started the exploration of more complex computer vision problems.
A supercomputer was no longer necessary to answer the question: “Is this an image of a cat?” This could now be done on a personal computer, opening the field to a much broader group of researchers.
The problems computer vision could address grew in sophistication. A computer could now determine whether an image contained a cat, a dog, a horse, or none of the above. These problems were explored in academia, but commercial applications also began to emerge, in technologies like self-driving cars and image classification for YouTube and Google Images.
By the middle of the decade, computer vision was able to identify where a cat was in an image, the size of the cat, and the way the cat was posed. Around this time, facial recognition was becoming mainstream. Remember when Facebook started auto-tagging everyone in your photos?
Around this time, experts in the field of computer vision started to bring some of the methodologies from non-medical fields into the biomedical space.
In 2018, the founders of Moichor recognized these computer vision methods could be applied to counting blood cells using imaging.
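To make the idea of counting cells with imaging concrete, here is a minimal illustrative sketch, not Moichor's actual pipeline: a classical baseline thresholds the image into foreground and background, then counts connected clusters of foreground pixels, with each cluster standing in for one cell. The `count_blobs` function and the toy `slide` grid are hypothetical examples for this post.

```python
from collections import deque

def count_blobs(grid):
    """Count connected foreground regions (4-connectivity) in a binary grid.

    A toy stand-in for classical cell counting: after thresholding an
    image to 1s (stained pixels) and 0s (background), each connected
    cluster of 1s is treated as one cell.
    """
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and not seen[r][c]:
                count += 1  # found a new, unvisited blob
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:  # flood-fill the whole blob
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return count

# A tiny hypothetical "slide": three separate blobs of stained pixels.
slide = [
    [1, 1, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [0, 0, 1, 0, 0],
]
print(count_blobs(slide))  # → 3
```

Approaches like this break down on overlapping cells and the varied morphology of avian and reptile blood, which is exactly where learned computer vision models have the advantage over fixed rules.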
Since then, the field of computer vision has become much more sophisticated. It can now, for example, describe what’s happening in an image.
As the technology grows increasingly sophisticated, Moichor continues to apply the latest methodologies to improve the performance and precision of its methods.
While each of these domains has progressed on its own, what makes Moichor’s story unique is the way we have applied each of these innovations to build a new, higher standard without disrupting the way veterinarians conduct their business.
The goal is a seamless shift to actionable insights, and the point-of-care device we will be showcasing at ExoticsCon and WVC represents a major step toward that goal.