As part of my Cryptography module's coursework, I need to use the Extended Euclidean Algorithm to calculate the Greatest Common Divisor (GCD) of two BigIntegers. I've done this; it's straightforward and I know my code works.
However, there's a further specification that:
Compute the Extended Euclidean algorithm for the following inputs and provide (d, s, t) in decimal system format in a text file with one line per number.
I'm assuming they're asking me to write a file with three lines, one for each output (d, s, t), with each value converted to its base-10 (decimal) representation, i.e. for the value 126,325:

(1 × 10^5) + (2 × 10^4) + (6 × 10^3) + (3 × 10^2) + (2 × 10^1) + (5 × 10^0) = 126,325
Is my interpretation correct? What's the best way to go about doing this?
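For what it's worth, here is a minimal sketch of how this could look, assuming the convention d = gcd(a, b) = s·a + t·b. Note that BigInteger.toString() already produces the base-10 digit string, so no manual place-value conversion is needed; the class name, file name, and sample inputs below are my own illustrative choices, not from the assignment:

```java
import java.math.BigInteger;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class ExtGcdOutput {

    // Iterative Extended Euclidean algorithm.
    // Returns {d, s, t} such that d = gcd(a, b) = s*a + t*b.
    static BigInteger[] extendedGcd(BigInteger a, BigInteger b) {
        BigInteger oldR = a, r = b;
        BigInteger oldS = BigInteger.ONE, s = BigInteger.ZERO;
        BigInteger oldT = BigInteger.ZERO, t = BigInteger.ONE;
        while (!r.equals(BigInteger.ZERO)) {
            BigInteger q = oldR.divide(r);
            BigInteger tmp;
            tmp = r; r = oldR.subtract(q.multiply(r)); oldR = tmp;
            tmp = s; s = oldS.subtract(q.multiply(s)); oldS = tmp;
            tmp = t; t = oldT.subtract(q.multiply(t)); oldT = tmp;
        }
        return new BigInteger[]{oldR, oldS, oldT};
    }

    public static void main(String[] args) throws Exception {
        // Sample inputs; replace with the coursework's actual values.
        BigInteger a = new BigInteger("240");
        BigInteger b = new BigInteger("46");
        BigInteger[] res = extendedGcd(a, b);

        // toString() with no argument is base 10, so each line is
        // already "in decimal system format".
        Files.write(Path.of("output.txt"),
                List.of(res[0].toString(), res[1].toString(), res[2].toString()));
    }
}
```

Each of d, s, and t then appears on its own line of output.txt as a plain decimal number (negative coefficients get a leading minus sign, which toString() handles automatically).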