Tensorflow Mobilenet generates same prediction


I am working on a React Native application that makes predictions using the TensorFlow.js MobileNet model.

Currently, I am able to load the model and format the image for prediction, but the prediction result is always the same regardless of the input image.

If anyone has run into a similar issue and found a solution, please let me know, or point me in the right direction.

Output:

[{"className": "tench, Tinca tinca", "probability": 0}, {"className": "goldfish, Carassius auratus", "probability": 0}, {"className": "great white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias", "probability": 0}]

Code to format the image and classify it:

// Imports assumed by the snippet below:
import { useEffect } from 'react';
import { Image } from 'react-native';
import * as tf from '@tensorflow/tfjs';
import * as mobilenet from '@tensorflow-models/mobilenet';
import { decodeJpeg, fetch } from '@tensorflow/tfjs-react-native';

useEffect(() => {
  const fetchData = async () => {
    await tf.ready();
    const model = await mobilenet.load();
    // Resolve the bundled asset and fetch its raw JPEG bytes.
    const image = require('./cat_224_224.jpg');
    const imageAssetPath = Image.resolveAssetSource(image);
    const response = await fetch(imageAssetPath.uri, {}, { isBinary: true });
    const imageDataArrayBuffer = await response.arrayBuffer();
    const imageData = new Uint8Array(imageDataArrayBuffer);
    // Decode the JPEG bytes into an image tensor and classify it.
    const imageTensor = decodeJpeg(imageData);
    const floatImage = tf.cast(imageTensor, 'float32');
    const predictions = await model.classify(floatImage);
    console.log(predictions);
  };
  fetchData().catch(console.error);
}, []);
