I am not sure where to start; I adapted the code below from a template. It downloads all files from an HTTP server, first checking whether each file has already been downloaded and skipping it if so. Now I want to download only some of the files, and I am trying to think of an easy way to achieve one of the following:
- Get the last-modified or creation time of each file on the HTTP server. I understand how to do this for a local folder, but I don't want to download the file first and then check; I need to do it on the server. On a local PC it would be FileInfo infoSource = new FileInfo(sourceDir); and then infoSource.CreationTime, where sourceDir is the file path. Is something similar possible over HTTP?
- Get only the latest 10 files from the server: not just the latest one, but the latest 10.
- Monitor the server so that, once a file named MyFileName_Version is put on the site, it downloads the latest file matching that naming convention.
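For the first option, an HTTP server can report a file's timestamp without the body being downloaded: send a HEAD request and read the Last-Modified response header. A minimal sketch, reusing the placeholder URL from the question; the file name is a hypothetical example, and the server must actually set the header (HttpWebResponse.LastModified falls back to the current time when it is absent):

```csharp
using System;
using System.Net;

class LastModifiedCheck
{
    static void Main()
    {
        // Hypothetical file URL; substitute a real file from the listing.
        string fileUrl = "HTTP://LOCALHOUST:1000000/MyFileName_1.0.zip";

        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(fileUrl);
        request.Method = "HEAD"; // headers only, no file body is transferred

        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        {
            // LastModified parses the Last-Modified header into a DateTime.
            Console.WriteLine(response.LastModified);
        }
    }
}
```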
Any of these approaches would work for me, but I am still a newbie at this, so I am struggling. Currently I have the following code:
using System;
using System.IO;
using System.Net;
using System.Text.RegularExpressions;

namespace AutomaticUpgrades
{
    class Program
    {
        static void Main(string[] args)
        {
            // Base URL of the HTTP server hosting the directory listing.
            string url = "HTTP://LOCALHOUST:1000000";
            DownloadDataFromArtifactory(url);
        }

        private static void DownloadDataFromArtifactory(string url)
        {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
            using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
            using (StreamReader reader = new StreamReader(response.GetResponseStream()))
            {
                string html = reader.ReadToEnd();
                Regex regex = new Regex(GetDirectoryListingRegexForUrl(url));
                MatchCollection matches = regex.Matches(html);
                if (matches.Count > 0)
                {
                    // using ensures the WebClient is disposed even if a download throws.
                    using (WebClient webClient = new WebClient())
                    {
                        foreach (Match match in matches)
                        {
                            if (match.Success)
                            {
                                Console.WriteLine(match.Groups["name"]);
                                if (match.Groups["name"].Length > 5
                                    && NotYetDownloaded(match.Groups["name"].ToString(),
                                        "C:\\Users\\RLEBEDEVS\\Desktop\\sourceFolder\\Http-server Download"))
                                {
                                    webClient.DownloadFile(
                                        url + match.Groups["name"],
                                        "C:\\Users\\RLEBEDEVS\\Desktop\\sourceFolder\\Http-server Download\\"
                                            + match.Groups["name"]);
                                }
                            }
                        }
                    }
                }
            }
        }

        public static string GetDirectoryListingRegexForUrl(string url)
        {
            if (url.Equals("HTTP://LOCALHOUST:1000000"))
            {
                // Non-greedy quantifiers so each <a> tag yields its own match
                // even when the whole listing is on one line.
                return "<a href=\".*?\">(?<name>.*?)</a>";
            }
            throw new NotSupportedException();
        }

        // Returns true when the file is NOT already present in the local folder,
        // i.e. when it still needs to be downloaded.
        private static bool NotYetDownloaded(string httpFile, string folderLocation)
        {
            string[] files = Directory.GetFiles(folderLocation);
            foreach (string s in files)
            {
                if (Path.GetFileName(s) == httpFile)
                {
                    return false;
                }
            }
            return true;
        }
    }
}
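The other two options from the list can be built on the same listing parse. One sketch, under the same placeholder URL: issue a HEAD request per listed name to get its Last-Modified stamp, sort descending, and take the newest 10; separately, filter names against the MyFileName_Version convention before downloading. The regex pattern here is an assumption about what the version suffix looks like:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net;
using System.Text.RegularExpressions;

static class LatestFiles
{
    // Returns the names of the newest 'count' files, judged by the
    // Last-Modified header returned for a HEAD request per file.
    public static List<string> Newest(string baseUrl, IEnumerable<string> names, int count)
    {
        return names
            .Select(name =>
            {
                HttpWebRequest request = (HttpWebRequest)WebRequest.Create(baseUrl + "/" + name);
                request.Method = "HEAD"; // headers only, no body transferred
                using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
                {
                    return new { Name = name, Stamp = response.LastModified };
                }
            })
            .OrderByDescending(f => f.Stamp)
            .Take(count)
            .Select(f => f.Name)
            .ToList();
    }

    // Keeps only names following the MyFileName_Version convention,
    // e.g. "MyFileName_1.2.3.zip"; the exact pattern is an assumption.
    public static IEnumerable<string> MatchingConvention(IEnumerable<string> names)
    {
        Regex pattern = new Regex(@"^MyFileName_[\d.]+");
        return names.Where(n => pattern.IsMatch(n));
    }
}
```

One HEAD request per file is simple but chatty; if the server exposes an API that returns timestamps in the listing itself, that is the cheaper route.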
After some days of getting into 'HTTP-server mode', I arrived at a working solution, so I am posting it here. Along the way I also came to understand how the API works; in hindsight my question was not fully clear, but you learn as you go.
My Conclusion: