C#/Theoretical - How do I figure out how long an event takes *between two computers*?


I'll try my best to explain the situation, but please ask any questions if I am not clear about it.

My situation: I have two computers, A and B. On computer A, I have a program that views a database and displays it to the user. On computer B, I have a program where various actions by the user cause data to ultimately end up in said database (a very indirect process that goes through many third-party applications).

My goal: I need to figure out how long it takes from the moment I perform an action on B until that data is accessible from the program on A. Ideally, I need to narrow this down to a 50 ms margin of error.

Tools:

  • Computers A and B share a local network connection.
  • They can communicate via TCP, and I've used C# for this communication before. I know it isn't strictly accurate, but we can effectively treat the communication times B->A and A->B as constant (though each direction is its own constant).
  • They have access to a shared NTFS-formatted network drive.
  • I can tell on B, with negligible delay (for this purpose, effectively zero), when the action is performed.
  • I can tell, with negligible delay (again, effectively zero), when a value becomes accessible from the program/database on A.

Constraints/Roadblocks

  • Computer B does not have internet access
  • The clocks cannot be trusted to be in sync, even if they currently look to be. I've seen the two computers' clocks drift apart by minutes. I also do not know how timestamps on the network drive are determined.
  • While the TCP communication times are effectively constant, I do not know what those constants (A->B and B->A) are or how to measure them.
  • The actions and retrievals happen in third-party programs, the database is a reflection of another database, and I cannot access the server these values live on.
  • (Edit/added) Computer B does not know when data lands in the database, only when the action to place it there fires off.

I think this ultimately turns into some sort of math problem where I would use differences between timestamps taken within each computer, instead of comparing timestamps across the two computers. In other words, measuring how long the TCP communication itself takes rather than how long the process takes. If I had that, I could send a message from B to A when the event happens and a message from A to B when the data is available, and then subtract out the communication time. If this is my best option, I don't know how I would set it up, practically or in theory, since I can't trust the machines' own clocks.
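For what it's worth, here is a rough sketch of measuring just the communication time with a monotonic Stopwatch, so neither wall clock is involved. The host name, port, and the split of the round trip into two equal one-way delays are illustrative assumptions, not things I actually know:

    // Rough sketch: measure the round-trip time A -> B -> A with Stopwatch.
    // Stopwatch is monotonic, so the untrusted wall clocks never enter into it.
    using System;
    using System.Diagnostics;
    using System.Net;
    using System.Net.Sockets;

    static class RttSketch
    {
        // Runs on B: echo a single byte straight back to the sender.
        public static void RunEchoServer(int port)
        {
            var listener = new TcpListener(IPAddress.Any, port);
            listener.Start();
            while (true)
            {
                using var client = listener.AcceptTcpClient();
                var stream = client.GetStream();
                int b = stream.ReadByte();              // wait for the ping byte
                if (b >= 0) stream.WriteByte((byte)b);  // bounce it back immediately
            }
        }

        // Runs on A: round-trip time; treating one-way as RTT/2 assumes a
        // symmetric link, which is exactly the part I can't verify.
        public static TimeSpan MeasureRoundTrip(string hostB, int port)
        {
            using var client = new TcpClient(hostB, port);
            var stream = client.GetStream();
            var sw = Stopwatch.StartNew();
            stream.WriteByte(1);
            stream.ReadByte();
            sw.Stop();
            return sw.Elapsed;
        }
    }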

Also, while I would greatly appreciate responses/answers/ideas, to be honest part of my reason for writing this is to clearly describe and think through the issue for my own benefit. Additionally, I couldn't find an answer to this problem, so hopefully this will turn up if someone else has a similar problem down the road.


There is 1 answer

charly_b

I couldn't quite picture your actual environment from what you wrote, but:

Basic requirements are:

  • Have a shared time server accessible by A and B.
  • Assign a time to the event/record produced on B.
  • Look up that time for the event/record on A.

If one of those can't be provided, I don't think there is a way to solve this.

If the local computers' clocks can't be synced accurately enough, you need to set up your own time server. I would:

Write a time server that runs on B and uses, for example, the Win32 GetTickCount counter (Environment.TickCount in C#) as its notion of current time.

Make B's current time accessible to A through a TCP connection (or, more accurately, a serial connection).
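A minimal sketch of such a tick-count time service, assuming an arbitrary port (9000), an 8-byte reply, and Environment.TickCount64 as the monotonic counter (on older frameworks Environment.TickCount or a GetTickCount64 P/Invoke would do):

    // Sketch only: B serves its own monotonic tick count over TCP so that
    // A can use B's counter as the single reference clock.
    using System;
    using System.Net;
    using System.Net.Sockets;

    static class TickTimeServer
    {
        // Run this on B: every incoming connection gets B's tick count in ms.
        public static void Run(int port = 9000)
        {
            var listener = new TcpListener(IPAddress.Any, port);
            listener.Start();
            while (true)
            {
                using var client = listener.AcceptTcpClient();
                long ticksMs = Environment.TickCount64;   // milliseconds since B booted
                client.GetStream().Write(BitConverter.GetBytes(ticksMs), 0, 8);
            }
        }
    }

    static class TickTimeClient
    {
        // Run this on A: ask B for its current tick count.
        public static long GetRemoteTicksMs(string hostB, int port = 9000)
        {
            using var client = new TcpClient(hostB, port);
            var stream = client.GetStream();
            var buffer = new byte[8];
            int read = 0;
            while (read < 8) read += stream.Read(buffer, read, 8 - read);
            return BitConverter.ToInt64(buffer, 0);
        }
    }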

Store that time in the record/event produced on B.

When a new record/event is detected on A, read the stored time from the record and fetch B's current time through the TCP connection.

Diff the two times to get the result. (There is no big math in it.)
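Reusing the TickTimeClient helper from the sketch above, the diff on A could look roughly like this (the host name "computer-b" and the idea that the record carries B's tick count are placeholders):

    // On A, when a new record/event is detected.  recordTicksMs is the tick
    // count that B stored in the record when the event was produced.
    static long MeasureDelayMs(long recordTicksMs)
    {
        // Both values come from B's counter, so the skew between the two
        // machines' wall clocks cancels out of the subtraction.
        long nowTicksMs = TickTimeClient.GetRemoteTicksMs("computer-b");
        long elapsedMs = nowTicksMs - recordTicksMs;
        // Note: the query itself adds roughly one A -> B network trip to the
        // result, which eats into the 50 ms error budget.
        return elapsedMs;
    }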

You could also use the network drive as a channel to transport the current time, but I suspect synchronizing the file access is a bit tricky; a TCP connection is probably easier to establish.

Maybe using the "LastWriteTime" from the NAS could be another way to obtain a shared time. If everything works as I assume, you could just continuously write a file on B and read that file's LastWriteTime stamp on A. That could be your time server.

You may want to read up on setting up a TCP connection between two computers, or investigate how the LastWriteTime stamp of a file on the NAS is assigned/requested.

Something like: on B, File.WriteAllText(@"\\nas\timestamp.txt", Guid.NewGuid().ToString()); and on A, var aTimeStamp = new FileInfo(@"\\nas\timestamp.txt").LastWriteTime (or better, LastWriteTimeUtc).
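Fleshed out a little (still just a sketch: the UNC path from above, the write interval, and using LastWriteTimeUtc are assumptions you would need to verify):

    // On B: keep rewriting a small file so that its LastWriteTimeUtc,
    // as assigned by the NAS, acts as a shared heartbeat clock.
    using System;
    using System.IO;
    using System.Threading;

    static class NasHeartbeat
    {
        public static void RunWriterOnB(string path = @"\\nas\timestamp.txt")
        {
            while (true)
            {
                File.WriteAllText(path, Guid.NewGuid().ToString()); // content is irrelevant
                Thread.Sleep(50); // the write interval bounds the achievable resolution
            }
        }

        // On A: read the NAS-assigned write time as the shared timestamp.
        public static DateTime ReadSharedTimeUtcOnA(string path = @"\\nas\timestamp.txt")
        {
            return new FileInfo(path).LastWriteTimeUtc;
        }
    }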

You need to check whether the LastWriteTime resolution and the delay through the NAS are accurate enough for your purposes.

Hope that helps a bit.