A programmer named Caleb recently had a son, and trouble arrived right along with him. A newborn needs to be fed every few hours and cries loudly at night when he's hungry, waking the whole family and leaving everyone tossing and turning all night.
He asked experienced parents around him for advice, and what he got back was: just be patient and get through the first few months.
But Caleb, being a programmer, wasn't willing to settle for that, and set out to solve the problem with an engineer's mindset. In the end, by combining a camera with AI algorithms, he built an automatic baby hunger detection system that can spot the signs before the baby actually starts to cry.
Caleb had the program send a notification to his phone once it judged that the baby was 100 percent likely to be hungry.
That way, he can quietly get up and feed the baby himself, using his technology to protect his wife's sleep.
A system like this doesn't have to be built from scratch, because off-the-shelf human pose detection algorithms are already very mature.
Caleb, for example, uses Google's open-source MediaPipe, which covers full-body pose, face mesh, and hand motion detection.
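Caleb's own code isn't public, but a minimal sketch of what wiring a camera feed into MediaPipe's Face Mesh and Hands solutions might look like (in Python, with the camera index and the downstream scoring logic as assumptions) is:

```python
# Rough sketch: read frames from a baby-monitor camera and run MediaPipe
# Face Mesh (lip/head landmarks) and Hands (fist-to-mouth cues) on each one.
import cv2
import mediapipe as mp

mp_face_mesh = mp.solutions.face_mesh
mp_hands = mp.solutions.hands

cap = cv2.VideoCapture(0)  # camera index is an assumption

with mp_face_mesh.FaceMesh(max_num_faces=1) as face_mesh, \
     mp_hands.Hands(max_num_hands=2) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        face_result = face_mesh.process(rgb)   # 468 facial landmarks
        hand_result = hands.process(rgb)       # 21 landmarks per hand
        # ...feed face_result / hand_result into the hunger-scoring logic...

cap.release()
```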
The remaining question is:
How does AI tell if a baby is hungry?
Before getting started, Caleb did a lot of research on parenting forums.
According to what he read, crying means the baby has already entered the late stage of hunger. At that point it's hard to feed directly; the baby needs to be calmed down first.
Early signs of hunger include smacking or licking the lips, repeatedly opening and closing the mouth, and sucking on the lips, fingers, clothing, or toys.
Caleb wrote separate detection code for each of these behaviors and gave them different weights based on his experience watching his own son.
For example, lip smacking adds 10% confidence, and bringing a fist to the mouth adds another 10%.
Turning the head back and forth means the baby is searching for a food source, and he observed that how often his baby turns its head varies with how hungry it is.
So he set up a short time window: the more frequent the head turns within it, the more confidence is added, roughly as sketched below.
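The article doesn't show the actual scoring code, so the following is a hypothetical sketch of how the weights described above could be accumulated; the cue names, the 30-second window, and the per-turn bonus are all assumptions.

```python
# Hypothetical confidence scoring: lip smacking and fist-to-mouth each add
# 10%, and head turns add more the more often they occur inside a short
# sliding window. Each cue is assumed to be debounced upstream so a single
# gesture is only reported once.
import time
from collections import deque

HEAD_TURN_WINDOW_S = 30      # assumed length of the "short period of time"
head_turns = deque()         # timestamps of recent head turns
confidence = 0.0             # running hunger confidence, 0.0 .. 1.0

def on_cue(cue, now=None):
    """Update the hunger confidence when a detected cue is reported."""
    global confidence
    now = now if now is not None else time.time()

    if cue == "lip_smack":            # smacking or licking lips: +10%
        confidence += 0.10
    elif cue == "fist_to_mouth":      # fist or fingers at the mouth: +10%
        confidence += 0.10
    elif cue == "head_turn":          # rooting: searching for a food source
        head_turns.append(now)
        while head_turns and now - head_turns[0] > HEAD_TURN_WINDOW_S:
            head_turns.popleft()      # drop turns outside the window
        confidence += 0.02 * len(head_turns)   # more turns, bigger boost

    confidence = min(confidence, 1.0)
    return confidence
```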
Along the way he also ran into a problem: the baby's pacifier caused occlusion, so the algorithm could no longer identify lip movements accurately.
To deal with this, he retrained a customized model on top of the open-source one, assigning confidence differently depending on whether the baby had a pacifier in its mouth or not.
In the process he also found that the baby spits out the pacifier when it is really hungry. That action adds a full 30% confidence, a sign that crying is imminent.
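Continuing the same sketch, one way the pacifier state and the final notification might be handled is shown below; the send_notification helper is purely a placeholder, since the article doesn't say which push service Caleb actually uses.

```python
# Hypothetical pacifier handling: the custom-trained model is assumed to
# report whether a pacifier is visible in the mouth each frame. Going from
# "pacifier in" to "pacifier out" counts as spitting it out and adds 30%.
pacifier_in_mouth = False    # state remembered from the previous frame

def on_pacifier_state(present_now):
    """Called every frame with the custom model's pacifier detection result."""
    global pacifier_in_mouth, confidence
    if pacifier_in_mouth and not present_now:
        confidence = min(confidence + 0.30, 1.0)   # pacifier spat out: +30%
    pacifier_in_mouth = present_now

    if confidence >= 1.0:        # "100 percent likely to be hungry"
        send_notification("Baby is getting hungry - time for a feed")

def send_notification(message):
    # Placeholder: swap in whatever push/notification service is actually used.
    print("NOTIFY:", message)
```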
After the system was put into use, it did bring real benefits to Caleb's family. As he summed it up:
The baby is happier, and the adults get more sleep.
However, the story is not over yet…
Is a fully automatic feeding system feasible?
The early success on the software side wasn't enough to satisfy his DIY spirit.
Next, he hooked the system up to hardware and a mechanical rig to try to build a fully automatic feeding system.
The idea was bold, but the result went a bit off the rails.
He knew the contraption could easily cause an accident, so he first had an adult stand in for the baby during the experiment.
In the test, the stand-in held a pacifier in his mouth and imitated the baby's movements, repeatedly turning his head and smacking his lips to drive the system's confidence up, and finally spat out the pacifier to trigger the alarm.
What happened next can only be described as: dangerous stunt, please don't try this at home.
Caleb made a video of the whole experience and shared it online, where it drew attention in both the programmer and DIY communities.
He revealed that the entire build took only about 50 hours, because the open-source MediaPipe toolkit is already very complete.
One netizen commented: if I tried to build this system, the kid would be old enough to run errands by the time I finished.
There is a reason Caleb could move so fast: he is already familiar with the workflow for building object detection applications, having previously set up a detector in his own yard to catch his dog throwing up.
Some people, though, found the whole idea a bit unsettling.
Hungry → cry → get fed: that loop is supposed to be how a baby learns to interact with its surroundings and communicate with its parents. If food simply shows up without any effort, will that affect the baby's development?
Another netizen countered that the professional advice from parenting classes is to try to feed babies before they cry, so it shouldn't be a problem.
Caleb himself said that he and his wife won't rely entirely on the algorithm to decide when to feed their child, but with AI's help they can make child-rearing more efficient.
And of course, plenty of interested netizens are urging him: open-source it, quickly!
Video address:
https://www.youtube.com/watch?v=Lda1Sq8HRY4
MediaPipe
https://google.github.io/mediapipe/
Reference link:
[1] https://www.reddit.com/r/programming/comments/w58xyn/built_a_hungry_baby_alarm/