I am trying to write a very simple mobile application (J2ME) in Java. The idea is to access a website given by a URL and read the contents of the page into a buffer.

Here's the problem: this works perfectly fine for some URLs but not others. The example below (Wikipedia) works fine, but take "http://java.com/en/about/" as an example and the HttpConnection `hc` returns -1 from getLength(), so there is no length to size the buffer with and nothing gets read.
Here's my code:
String url = "http://en.wikipedia.org/wiki/RSS";

// Sets up HttpConnection and InputStream using the URL variable
HttpConnection hc = null;
InputStream is = null;
try {
    hc = (HttpConnection) Connector.open(url);
    is = hc.openInputStream();
} catch (IOException ie) {
    System.out.println(ie.getMessage());
}

// Reader object created to read input from the InputStream
Reader rdr = new InputStreamReader(is);

// Variable "content" will store the HTML code
String content = "";

// Get the length of the data to set the buffer size
int len = (int) hc.getLength();
Any ideas? Let me know if I've missed anything out!

Just for info, I am using NetBeans 6.9.1. The imports for HttpConnection are "javax.microedition.io.HttpConnection" and "javax.microedition.io.Connector".
The HTTP response from java.com is
The HTTP response from wikipedia is
As you can see, the HTTP response for http://java.com/en/about/ doesn't contain a Content-Length header; the body is sent with chunked transfer encoding instead.

So getLength() returns -1. That doesn't mean there is no content, only that the length isn't known up front.
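The fix is to stop sizing the buffer from getLength() and instead read the stream until read() returns -1 (end of stream). Here's a minimal sketch of that loop; for illustration it feeds a plain ByteArrayInputStream where your code would use hc.openInputStream(), since both ByteArrayOutputStream and the read loop below are available on CLDC/MIDP as well as standard Java:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadAll {

    // Read every byte from the stream until EOF, without relying on
    // Content-Length. Works for chunked responses where getLength() is -1.
    static byte[] readAll(InputStream is) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        byte[] buf = new byte[1024];
        int n;
        while ((n = is.read(buf)) != -1) { // -1 here means end of stream
            bos.write(buf, 0, n);
        }
        return bos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for hc.openInputStream() so the sketch is self-contained.
        InputStream is = new ByteArrayInputStream("<html>hello</html>".getBytes());
        String content = new String(readAll(is));
        System.out.println(content); // prints the whole body regardless of length
    }
}
```

In your code you would call readAll(is) on the stream from hc.openInputStream() and drop the `int len = (int) hc.getLength();` line entirely, so both the Wikipedia and the java.com pages are read the same way.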