How to send an image (UIImage) from a Swift iOS client to a Java server?


My Question:

  • How do I properly convert and send images from an iOS Swift client app to a Java server (without relying on many external SDKs)?

  • Which type of socket should I use in Swift? (I am new to Swift and can't really find ANY suitable socket.)

  • Please give me example code, as I am not at all well versed in Swift syntax and libraries.

Expected result from my program: the iOS Swift app should efficiently connect to my Java server and send images of video frames to it live. The images should then be converted to BufferedImage on the Java server machine and played as video!

Regarding previously asked questions: I found only one similar question, but the answer was not very informative.

Details

  • So, I have written a Java server program on my Mac, and I want to add a feature in which the user can send a live video feed from an iPhone (iOS device) to my Java server program.

  • The iOS app is written in Swift in Xcode.

  • In order to do that, I capture a CGImage from each video frame in the Swift program and convert it into a UIImage; then I convert this UIImage to JPEG byte data as follows:

    let cgImage: CGImage = context.createCGImage(cameraImage, from: cameraImage.extent)! // cameraImage is grabbed from the video frame
    image = UIImage(cgImage: cgImage)
    let data = UIImageJPEGRepresentation(image, 1.0)
    
  • This byte data is then sent to the IP address and port where my Java server is running, using SwiftSocket's TCPClient (https://github.com/swiftsocket/SwiftSocket):

     client?.send(data: data!)
    
  • Here client is an object of type TCPClient (https://github.com/swiftsocket/SwiftSocket/blob/master/Sources/TCPClient.swift), declared in Swift like this:

        client = TCPClient(address: host, port: Int32(port))
        client?.connect(timeout: 10)
    
  • The connection is successful, and the Java server spawns a MobileServer thread to handle this client; a DataInputStream and DataOutputStream are opened on the accepted socket. This is the run() method of the MobileServer thread spawned by the Java server (where "in" is the DataInputStream derived from the accepted socket):

    public void run()
    {
        try {
            while (!stop) {
                int count = in.available();
                if (count > 0) System.out.println("LENGTH=" + count);
                byte[] arr = new byte[count];
                System.out.println("byte=" + arr);
                in.read(arr);

                BufferedImage image = null;
                try {
                    InputStream inn = new ByteArrayInputStream(arr);
                    image = ImageIO.read(inn);
                    inn.close();
                } catch (Exception f) { f.printStackTrace(); }
                System.out.println("IMAGE=" + image);

                if (image != null)
                    appendToFile(image);
            }
        } catch (Exception l) { l.printStackTrace(); }
    }
    
  • The problem is that my Java server receives strange byte sequences that are not properly convertible to a BufferedImage, so on viewing the "video" stored in the file I can only see a thin strip of "image", even though the iPhone is capturing fine. (Basically, the image is not properly transferred from the iOS app to my server.)

The entire Swift program's viewController.swift for video capture is derived from this GitHub project: https://github.com/FlexMonkey/LiveCameraFiltering

Edit: I have figured out the problem and posted it as an answer, but this is still just a workaround, because the server's video feed still hangs a lot and I had to reduce the quality of the image byte data sent by the Swift client. There is definitely a better way to do things, and I ask people to share their knowledge.

1 Answer

Swapnil B (best answer):

So, I was not able to find a complete and absolutely perfect solution to the above-mentioned problem, but for the benefit of any other Swift beginner who might stumble across a similar cross-language client-server problem, here are my two cents:

The first and foremost error in the above-mentioned code is this line:

 let data = UIImageJPEGRepresentation(image, 1.0)

1) Here, I was encoding the UIImage at the highest possible quality by supplying a compression factor of 1. As I later discovered, this led to byte arrays of more than 100,000 bytes, which were difficult to send quickly through the TCPClient socket.

2) Even if such a large array were sent efficiently by the TCPClient socket, it would be difficult for the Java DataInputStream on the server side to read the complete data at once. It was probably reading only small chunks of data at a time, so the image generated at the Java server end was partial and fuzzy.

3) This line was another problem:

   int count = in.available();
   if (count > 0) System.out.println("LENGTH=" + count);
   byte[] arr = new byte[count];
   System.out.println("byte=" + arr);
   in.read(arr);

The in.available() method does not return the total length of the data sent by the client; it only reports how many bytes happen to be buffered at that moment. This led to reading incomplete byte data and thus incomplete images.
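For what it's worth, the textbook fix for both 2) and 3) is to frame each image explicitly: the client prepends a 4-byte length before the JPEG bytes, and the server reads exactly that many bytes with readFully() instead of trusting available(). A minimal sketch of the server side; this is not the code I used, and the names are illustrative:

    // needs: java.io.*, javax.imageio.ImageIO, java.awt.image.BufferedImage
    // Reads one length-prefixed JPEG frame: the client writes a 4-byte
    // big-endian length followed by exactly that many image bytes.
    static BufferedImage readFrame(DataInputStream in) throws IOException {
        int length = in.readInt();   // blocks until the 4-byte prefix arrives
        byte[] frame = new byte[length];
        in.readFully(frame);         // blocks until the whole frame has been read
        return ImageIO.read(new ByteArrayInputStream(frame));
    }

On the Swift side, the client would prepend the JPEG's byte count as a big-endian UInt32 to the Data before calling client?.send(data:).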

The solution/workaround (kind of)

  • I reduced the compression factor to about 0.000005 in the Swift client's UIImageJPEGRepresentation() call, which led to byte arrays of length ~5,000 (which was manageable).

  • To avoid the problem of reading incomplete data at the server side, I converted the byte array to a base64 string and simply appended a termination character "%" at the end, which marks the end of one base64 string at the server side.

  • I changed the DataInputStream/DataOutputStream on the server side to InputStreamReader/OutputStreamWriter, since I was now dealing with characters/strings.

  • The Java server's InputStreamReader takes in one character at a time and builds a string until it encounters the termination character "%"; the base64 string is then converted back to a byte array with the following (note that javax.xml.bind was removed from the JDK in Java 11; java.util.Base64.getDecoder().decode(str) is the modern equivalent):

    imageBytes=javax.xml.bind.DatatypeConverter.parseBase64Binary(str); 
    //str is a String formed by concatenating characters received by the InputStreamReader 
    
  • This imageBytes array is then converted into a BufferedImage and drawn on a panel, one frame after another, reproducing the original iPhone live video. (A sketch of such a panel appears after the modified server code below.)

Modified Swift Code(ios client)

    func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!)
    {
        let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
        let cameraImage = CIImage(cvPixelBuffer: pixelBuffer!)
        let context = CIContext(options: nil)
        let cgImage: CGImage = context.createCGImage(cameraImage, from: cameraImage.extent)!
        let image = UIImage(cgImage: cgImage)

        DispatchQueue.main.async {
            // live video captured from the camera, shown in the device's own UIImageView
            self.imageView.image = image
        }

        // snapshot image from the camera, resized
        let thumbnail = resizeImage(image: image, targetSize: CGSize(width: 400, height: 400))

        // the snapshot image converted into byte data at very low quality
        let data = UIImageJPEGRepresentation(thumbnail, 0.000005)

        // the byte image data is encoded to a base64 string
        let base64String = data!.base64EncodedString(options: Data.Base64EncodingOptions(rawValue: 0))

        var encodeImg = base64String.addingPercentEncoding(withAllowedCharacters: .urlQueryAllowed)
        encodeImg = encodeImg! + "%" // termination char % is added at the end

        var sendData = "0%"
        if live {
            sendData = encodeImg!
        }
        client?.send(string: sendData) // sent as a String using the TCPClient socket
    }

Modified Java Server Side run() method of the MobileServer Thread class

    public void run()
    {
        try {
            while (!stop) {
                // read and append char by char from the InputStreamReader "in"
                // until a '%' terminator is encountered
                char chr = (char) in.read();
                if (chr != '%')
                    str += Character.toString(chr);
                else
                    terminate = true;

                if (terminate) {
                    if (entry) {
                        // first message is the login handshake: "username&password#ip"
                        int a = str.indexOf('&');
                        int b = str.indexOf('#');
                        String username = str.substring(0, a);
                        String password = str.substring(a + 1, b);
                        String ip = str.substring(b + 1, str.length());
                        System.out.println("IP ADDRESS: \"" + ip + "\"");

                        String[] usernameA = convertToArray(username);
                        String[] passwordA = convertToArray(password);
                        String user = decrypt(usernameA, portt);
                        String pass = decrypt(passwordA, portt);

                        boolean accessGranted = false;
                        int response = dbManager.verify_clientLogin(user, pass);
                        if (response == RegisterInfo.ACCESS_GRANTED) {
                            System.out.println("access granted");
                            accessGranted = true;
                        }

                        int retInt = -1;
                        if (accessGranted) retInt = 1;
                        out.write(retInt);

                        entry = false;
                        terminate = false;
                    } else {
                        terminate = false;

                        try {
                            // str holds one complete base64 string produced by the
                            // Swift client; convert it back to a byte array
                            imageBytes = javax.xml.bind.DatatypeConverter.parseBase64Binary(str);
                        } catch (ArrayIndexOutOfBoundsException l) { exception = true; }
                        str = "";

                        if (!exception) {
                            // the image bytes are decoded and drawn by the video player;
                            // successive images played back-to-back appear as a video stream
                            vidPlayer.playImage(imageBytes);
                            ioexcep = false;
                        } else {
                            exception = false;
                        }
                    }
                }
            }
        } catch (Exception l) { l.printStackTrace(); }
    }
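The vidPlayer.playImage() call above refers to a class whose code I have not shown; a minimal sketch of what such a panel could look like is below. The class name and details here are illustrative, not my exact implementation:

    import java.awt.Graphics;
    import java.awt.image.BufferedImage;
    import java.io.ByteArrayInputStream;
    import javax.imageio.ImageIO;
    import javax.swing.JPanel;

    // Minimal video panel: each incoming JPEG byte array replaces the
    // current frame and triggers a repaint, so successive frames play as video.
    public class VideoPanel extends JPanel {
        private volatile BufferedImage current;

        public void playImage(byte[] imageBytes) {
            try {
                current = ImageIO.read(new ByteArrayInputStream(imageBytes));
                repaint();
            } catch (Exception e) { e.printStackTrace(); }
        }

        @Override
        protected void paintComponent(Graphics g) {
            super.paintComponent(g);
            if (current != null) {
                g.drawImage(current, 0, 0, getWidth(), getHeight(), null);
            }
        }
    }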
  • [Screenshot: the video being played on the server]

  • But as you can see, because the image was sent at a very low quality from the Swift client, the video produced is of low quality as well. Plus, the video still hangs in between.

  • I am sure there are better methods to send higher-quality image data from the Swift socket, as evidenced by all the video-chat apps on the market, and I would be happy if anyone could throw some light on the advanced methods involved in achieving HD image transmission.

  • One method could be buffering the byte data on the client and server side, so that higher-quality JPEG data can be transmitted and played smoothly; a rough sketch of this idea follows.
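As a sketch of that buffering idea (my own illustration, reusing the hypothetical VideoPanel from above): the socket-reading thread pushes decoded frames into a bounded queue, while a separate playback thread drains the queue at a fixed frame rate, which smooths out network jitter at the cost of a little latency:

    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    // Decouples network reads from rendering: the reader thread offers frames,
    // the playback thread takes them at a steady ~30 fps.
    public class FrameBuffer {
        private final BlockingQueue<byte[]> frames = new ArrayBlockingQueue<>(30);

        // Called by the socket-reading thread. When the queue is full, the
        // oldest frame is dropped so playback never lags far behind the feed.
        public void push(byte[] frame) {
            while (!frames.offer(frame)) {
                frames.poll();
            }
        }

        // Starts a daemon thread that plays queued frames on the panel.
        public void startPlayback(VideoPanel panel) {
            Thread player = new Thread(() -> {
                try {
                    while (true) {
                        panel.playImage(frames.take()); // blocks until a frame arrives
                        Thread.sleep(33);               // ~30 frames per second
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
            player.setDaemon(true);
            player.start();
        }
    }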