You can get a pretty good generalisation if you know what the food is. How do you think current nutrition-tracking apps work? All this will do is try to figure out what the food is from the picture rather than having the user type it in. With most foods you can tell what they are without "looking inside". I'm pretty sure there are apps that do that now; this isn't something new and groundbreaking.
And for nutrition tracking you don't need to be 100% exact, because you can't be 100% exact even if you know the exact ingredients and how much of each one is used. Everything always has some variance. This method doesn't need to be perfect to meet the needs of most people who will use it.
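To illustrate the point: once the app has a label for the food (whether typed in or guessed from a photo), the nutrition estimate is just a lookup against generic per-food values scaled by portion size. A minimal sketch of that idea in Python, with entirely made-up illustrative numbers and a hypothetical `estimate` function:

```python
# Hypothetical sketch: the classifier's only job is to produce a label;
# the nutrition numbers come from a generic lookup table, so they carry
# the same built-in variance either way. Values below are illustrative.
GENERIC_NUTRITION = {
    # label: (kcal, protein_g, carbs_g, fat_g) per 100 g
    "banana": (89, 1.1, 22.8, 0.3),
    "apple": (52, 0.3, 13.8, 0.2),
    "pizza": (266, 11.0, 33.0, 10.0),
}

def estimate(label: str, grams: float) -> dict:
    """Scale generic per-100g values to an estimated portion size.

    The result is an approximation by design: the table holds averages,
    and the portion size is itself a guess.
    """
    kcal, protein, carbs, fat = GENERIC_NUTRITION[label]
    scale = grams / 100.0
    return {
        "kcal": kcal * scale,
        "protein_g": protein * scale,
        "carbs_g": carbs * scale,
        "fat_g": fat * scale,
    }

# e.g. a medium banana, roughly 120 g
print(estimate("banana", 120))
```

Whether the label comes from the user or from an image classifier, the downstream math is identical; the photo only replaces the typing step.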
I agree that you can get a generic nutrition value from a photo of a simple fruit or vegetable, but a pie or cake contains so many ingredients that look identical to other ingredients that photographic analysis is useless there.
So yes, you can get some idea of the nutrition of some foods, but the accuracy is way too low to be useful.
Re-read the first one I sent.