I disabled TLS 1.0 and 1.1 in the Windows registry (SCHANNEL) like so:
[HKEY_LOCAL_MACHINE\SYSTEM...\SCHANNEL\Protocols\TLS 1.0\Client]
"DisabledByDefault"=dword:00000001
"Enabled"=dword:00000000
[HKEY_LOCAL_MACHINE\SYSTEM...\SCHANNEL\Protocols\TLS 1.0\Server]
"DisabledByDefault"=dword:00000001
"Enabled"=dword:00000000
[repeat for 1.1]
When I hit https://clienttest.ssllabs.com:8443/ssltest/viewMyClient.html with a browser, I get
TLS 1.3 No
TLS 1.2 Yes
TLS 1.1 No
TLS 1.0 No
When I hit the same endpoint from the same server with a .NET Framework 4.7.2 console app that executes these lines
using System.Net.Http;

var c = new HttpClient();
var response = c.GetAsync("https://clienttest.ssllabs.com:8443/ssltest/viewMyClient.html").Result;
I get
TLS 1.3 No
TLS 1.2 Yes*
TLS 1.1 Yes*
TLS 1.0 Yes*
Why is that, and how can I restrict (or enable) protocols from outside the app? The underlying issue is that I cannot get .NET clients on this server to use TLS 1.2 on outbound connections, even though it is enabled at the OS level; understanding why I cannot disable the other protocols might help me figure that out.
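For context on what I have tried: Microsoft's TLS guidance says the SCHANNEL Protocols keys control the OS defaults, but a .NET Framework 4.x app only inherits those defaults when .NET itself is told to use the system default TLS versions. The machine-wide keys below are the ones Microsoft documents for that; whether they resolve my specific case is an assumption I have not yet verified:

; Assumption: these make .NET Framework 4.x defer to the OS (SCHANNEL) protocol settings
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v4.0.30319]
"SystemDefaultTlsVersions"=dword:00000001
"SchUseStrongCrypto"=dword:00000001

; 32-bit processes on 64-bit Windows read the Wow6432Node hive instead
[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v4.0.30319]
"SystemDefaultTlsVersions"=dword:00000001
"SchUseStrongCrypto"=dword:00000001

The in-app alternative would be setting ServicePointManager.SecurityProtocol (e.g. to SecurityProtocolType.Tls12 or SystemDefault) before the first request, but I am specifically looking for a way to do this from outside the app.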
Ah, never mind, I missed the (*) footnote: