Properly display image source in the browser with JavaScript API



  • Hello,
    I'm trying to create a function that reads images from the JavaScript API and sends them to the client in the browser. It may sound like a simple thing to do, but so far I haven't achieved my goal; either I'm missing something in the docs or I don't understand the concepts.
    Here is what I have done so far:
    P.S. The app I'm connected to exposes imageSource, imageCount and missingCount as APIs, and I'm using my webcam to run it.

    • Read the image count:
    const VisionAppster = require('visionappster');
    const appId = 'com.example.57c8bf60df644c40adf137bbc594fb27/1/';
    const url = 'http://localhost:2015/apis/' + appId;
    const remote = new VisionAppster.RemoteObject(url);
    async function imageCount(remote){
        // Subscribe to the $imageCount signal and log each received value.
        const obj = await remote.connect();
        obj.$imageCount.connect((params) => console.log('received', params));
    }
    imageCount(remote);
    

    Checking my console, the output looks fine:

    received 7707
    received 7715
    

    Now I'm trying to read an image like this:

    async function readImage(remote){
        // Subscribe to the $image signal and log the raw pixel data.
        const obj = await remote.connect();
        obj.$image.connect((params) => console.log('received', params.data));
    }
    readImage(remote);
    

    And the output looks something like this:

    received Uint32Array(921600) [
      4280433715, 4280498738, 4280367152, 4280300333, 4280168749,
      4280299054, 4280364080, 4280429364, 4280428598, 4280494139,
      4280559932, 4280691519, 4280757824, 4280823617, 4280889662,
      4280889660, 4280889915, 4280824120, 4280758325, 4280626739,
      4280495153, 4280560946, 4280692534, 4280824122, 4280758329,
      4280692281, 4280560184, 4280494391, 4280625460, 4280625460,
      4280625712, 4280625710, 4280429865, 4280627242, 4280693548,
      4280825134, 4280628269, 4280496683, 4280431401, 4280431401,
      4280366891, 4280301098, 4280170024, 4280170024, 4280170024,
      4280170024, 4280235817, 4280301357, 4280169775, 4280169268,
      4280300597, 4280431418, 4280628028, 4280824128, 4280824130,
      4280823618, 4280888900, 4280888900, 4280888900, 4280757314,
      4280494652, 4280428859, 4280363577, 4280429624, 4280298805,
      4280364596, 4280298805, 4280233012, 4280298036, 4280363829,
      4280429624, 4280561210, 4280627001, 4280627001, 4280561207,
      4280495414, 4280364850, 4280299057, 4280233774, 4280233775,
      4280430641, 4280430643, 4280627250, 4280758579, 4280823859,
      4281019955, 4281085750, 4281216568, 4281479227, 4281413436,
      4281347130, 4281281337, 4281347895, 4281611065, 4281874236,
      4282137408, 4281874749, 4281808957, 4282796881, 4284639085,
      ... 921500 more items
    ]
    

    Now, I was able to send this data through WebSockets to the browser and convert it to a Uint8Array, a Blob and other types, but nothing seems to turn this binary data into an actual image. For example, this is my canvas after receiving the data converted to a Uint8Array:
    forum2.JPG
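
    A minimal sketch of one way such a buffer could be unpacked into canvas ImageData on the client, assuming each 32-bit value is a packed 0xAARRGGBB pixel (the leading 0xFF… in the dump above hints at alpha in the high byte, and 921600 pixels matches 1280 × 720; both are assumptions to verify against the Image documentation):

    // Sketch only: unpack a Uint32Array of packed pixels into canvas ImageData.
    // The 0xAARRGGBB layout is assumed here, not taken from the docs.
    function drawPackedPixels(canvas, packed, width, height) {
      canvas.width = width;
      canvas.height = height;
      const ctx = canvas.getContext('2d');
      const imageData = ctx.createImageData(width, height);
      const rgba = imageData.data;             // Uint8ClampedArray: R, G, B, A per pixel
      for (let i = 0; i < packed.length; ++i) {
        const p = packed[i];
        rgba[4 * i]     = (p >>> 16) & 0xff;   // R
        rgba[4 * i + 1] = (p >>> 8) & 0xff;    // G
        rgba[4 * i + 2] = p & 0xff;            // B
        rgba[4 * i + 3] = (p >>> 24) & 0xff;   // A
      }
      ctx.putImageData(imageData, 0, 0);
    }
    // e.g. drawPackedPixels(document.getElementById('myCanvas'), params.data, 1280, 720);
    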

    If I add a media type to my image data like this:

    obj.$image.connect((params) => console.log('received', params.data), 
    {mediaType: ['image/jpeg']});
    

    Then the output is empty, and at this point I don't know what my next step should be.
    Sorry for the long post 🙂



  • It seems you have received the Image object and it was decoded successfully. There is something on the canvas as well, but it is hard to say what it is. Maybe you tried to draw a color image as RGB or vice versa? It looks like there is at least a row alignment issue there.

    The easiest way to display an image on an HTML page is to use an <img> element:

    <img id="img-id">
    ...
    obj.$image.connect(image => image.toHtmlImage(document.getElementById('img-id')));
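
    A fuller sketch of the same idea, assuming VisionAppster.js is loaded on the page so that the VisionAppster global is available in the browser (the API URL below is the one from the first post):

    const url = 'http://localhost:2015/apis/com.example.57c8bf60df644c40adf137bbc594fb27/1/';
    const remote = new VisionAppster.RemoteObject(url);
    remote.connect().then(obj =>
      obj.$image.connect(image => image.toHtmlImage(document.getElementById('img-id'))));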
    


  • @Topi Hi Topi,
    Thank you very much for your quick reply! 🙂
    Indeed, I needed to adjust the width and height of my blank canvas to match the received data on the client side. However, the image that I get is somewhat different from what I would expect.
    I'd like to share my current setup for getting the image into the browser. I believe it's far from best practice, just something to get going.
    Server-side script running on Node.js:

    //ws_server.js
    const WebSocket = require('ws');
    const ws = new WebSocket.Server({ port: 8080 });
    const VisionAppster = require('visionappster');
    const channel_id = '/com.example.57c8bf60df644c40adf137bbc594fb27/1';
    const apiCtrl = new VisionAppster.RemoteObject('http://192.168.1.115:2015/apis' + channel_id);
    ws.on('connection', function connection(wsConnection) {
      wsConnection.on('message', function incoming(message) {
        console.log(`server received: ${message}`);
      });
      wsConnection.send('received message!');
      // Forward every image frame to the browser as raw bytes.
      apiCtrl.connect().then(api => api.$image.connect((image) => {
        console.log('sent to client', new Uint8Array(image.data));
        // console.log('sent to client', image.data);
        // wsConnection.send(new Buffer.from(new Uint8Array(image.data)));
        wsConnection.send(new Uint8Array(image.data));
      }));
    });
    

    Client-side script running in the browser:

    //ws_client.js
    const socket = new WebSocket('ws://localhost:8080'); 
    socket.binaryType = 'arraybuffer'; // receive binary messages as ArrayBuffer instead of Blob
    socket.addEventListener('open', function (event) { 
      socket.send('init'); 
    }); 
    var getCanvasId = document.getElementById('myCanvas');
    var imageData = getCanvasId.getContext('2d').createImageData(1280, 780);
    var pixels = imageData.data;
    socket.addEventListener('message', function (event) { 
      var buffer = new Uint8Array(event.data);
      for (var i=0; i < pixels.length; i++) {
        pixels[i] = buffer[i];
      }
      console.log(buffer)
      getCanvasId.getContext('2d').putImageData(imageData, 0, 0);
    });
    
    socket.addEventListener('close', function (event) { 
      console.log('The connection has been closed'); 
    });
    

    Index.html

    <!DOCTYPE html>
    <html>
        <head>
            <meta charset="utf-8">
            <meta http-equiv="X-UA-Compatible" content="IE=edge">
            <title>websocket test</title>
            <meta name="description" content="">
            <meta name="viewport" content="width=device-width, initial-scale=1">
            <link rel="stylesheet" href="">
        </head>
        <body>
            <h1>Websocket test</h1>
            <div class="display">
                <h4>$image</h4>
                <canvas id="myCanvas" width="200" height="200"></canvas>
              </div>
            <script type="text/javascript" src="./ws_client.js"></script>
        </body>
    </html>
    

    Output from my web camera:
    canvas.png
    So, this should work out of the box if your vision app is running.
    Basically, my goal is to make a starter template for reading images and data on the client side and vice versa, but the first step is to get the image across properly. Again, sorry for the long post 🙂
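
    One possible refinement of the server snippet above (a sketch only): inside the connection handler, send the frame dimensions before the pixels so the client does not have to hard-code the canvas size, and send a byte view of the buffer rather than an element-by-element copy. Whether the Image object exposes width and height like this is an assumption to check against the VisionAppster docs.

    apiCtrl.connect().then(api => api.$image.connect((image) => {
      // Hypothetical: width/height as plain properties of the Image object.
      wsConnection.send(JSON.stringify({ width: image.width, height: image.height }));
      // new Uint8Array(buffer) views the raw bytes of the Uint32Array;
      // new Uint8Array(typedArray) would copy element values instead.
      wsConnection.send(new Uint8Array(image.data.buffer));
    }));
    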



  • @luqt123 said in Properly display image source in the browser with JavaScript API:

    const apiCtrl = new VisionAppster.RemoteObject('http://192.168.1.115:2015/apis' + channel_id);

    maybe this one should be changed to

    const apiCtrl = new VisionAppster.RemoteObject('http://localhost:2015/apis' + channel_id);
    


  • I think the template I need is in webapp.js.
    It's well documented and a very good starting point.



  • Sorry, but I don't quite get what you are trying to achieve. You only need server-side (e.g. Node.js) code if your application requires image processing in the back-end.

    Drawing pixels on the canvas in the browser requires careful handling of different RGB types and endianness. Please see the Image.toImageData() function. However, I'd recommend using <img> instead. The cookbook recipe you are referring to uses <canvas> and toImageData(). Please let me know if that solves your problem.
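
    A minimal sketch of the canvas route, assuming obj is the connected remote object from the earlier snippets and that toImageData() returns a standard browser ImageData (as the cookbook recipe suggests):

    const canvas = document.getElementById('myCanvas');
    const ctx = canvas.getContext('2d');
    obj.$image.connect(image => {
      const imageData = image.toImageData(); // assumed to be a browser ImageData
      canvas.width = imageData.width;
      canvas.height = imageData.height;
      ctx.putImageData(imageData, 0, 0);
    });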



  • Right, I'm a bit confused as well.
    At the moment I'm using a canvas to draw the pixels in and then display it. However, I would like to use the <img> tag with the snippet you mentioned earlier, obj.$image.connect(image => image.toHtmlImage(document.getElementById('img-id')));
    This would work in the web app, but my index.html is not part of VA's web app; it's just a plain web page. If I try to import VisionAppster.js like this, <script src="VisionAppster.js"></script><script type="text/javascript">, the browser starts to complain about missing CORS headers (Reason: CORS header ‘Access-Control-Allow-Origin’ missing). But maybe this requires another post.
    So my other option was to do all the handling on the Node.js server side. To be fair, in that case it's true I don't really need to stream all the signals into my web page, but I wanted to see how to do it anyway.
    For now, the only way I was able to put the image into the <img> tag is to draw all the incoming data (pixels) onto a canvas and then use

      // Convert the drawn canvas into a JPEG blob and swap it into the <img> element.
      canvasElement.toBlob(changeImage, 'image/jpeg');
      function changeImage(blob){
        const oldUrl = cameraImage.src;
        cameraImage.src = URL.createObjectURL(blob);
        URL.revokeObjectURL(oldUrl); // release the previous object URL
      }
    

    I was a bit confused that if you define mediaType: ['image/jpeg'] on the image signal, it stops returning data, so I was wondering whether it should be that way.
    Then I checked what kind of data I can get with the compressed Image tool. It might be that the tool is broken, since it stops the running app after maybe 10 seconds of running. But this probably requires another post as well.

