Why would you use '"%d", variable' when you can simply type the variable name?


I was wondering why my professor would type

System.out.println("%d", myRank);

instead of

System.out.println(myRank);

From my point of view, the latter is inherently more efficient and has the same effect.

I've tried both and they perform the same, so I am quite confused as to why the former is used at all.


There are 3 answers

Tom (Best Answer)

First of all, System.out.println("%d", myRank); will not compile.

You can use System.out.printf, or build the string with String.format(...) and pass it to println.
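
As a sketch (assuming myRank is an int, which the question implies; the value here is made up purely for illustration), the versions that actually compile look like this:

int myRank = 3;  // hypothetical value, just for illustration

System.out.println(myRank);                      // plain println
System.out.printf("%d%n", myRank);               // printf with a format string
System.out.println(String.format("%d", myRank)); // String.format, printed with println

All three print the same thing for a single int.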

For a single argument, as you mentioned, it does not make much sense. The value of String.format (or printf) shows when you would otherwise resort to string concatenation: a format string makes the code more readable and easier to maintain.

For example, you can print the following message:

System.out.printf("my rank is %d, which is nice. and my friend's rank is %d. which is lower than mine", myRank, friendRank);

Instead of:

System.out.printf("my rank is " + myRank + ", which is nice. and my friend's rank is " + friendRank + ". which is lower than mine");
Mureinik

I'm assuming that you meant to use printf in the first snippet and not println, because otherwise it just won't compile.

Assuming that you did, the first snippet really isn't very useful, as you noted. The main point of format strings is to be able to easily interleave variables into them:

String dayName = "Friday";
int day = 13;
System.out.printf("Today is %s the %dth", dayName, day);
rzwitserloot

Consistency with the present and consistency with the future.

Whether you agree that it is wise to do this is one of those endless 'opinions about the right way to write code' debates. In the end, writing code is a craft, and thus opinions differ on what excellence in the craft looks like. (So, yes, 'software engineer' as a term is rather inaccurate; 'software artisan' is a lot closer to how software development is actually done. One can lament that or celebrate it, but that's beyond this question and probably beyond what SO is for. I'm merely stating how things are, not how they should be.)

But first...

Why printf over println at all?

There are a number of reasons to use printf over println. The obvious one is that you want the output string to be a template, i.e. anything more complex than literally "%d" or "%f". Once you end up with ("Hello, %s, welcome to %s! There are %d before you in line", userName, serverName, queueSize), printf is obviously superior in this regard; a sketch follows below.
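
A minimal sketch of that comparison (the variable names and values are hypothetical, taken only from the sentence above):

String userName = "alice";         // hypothetical values for illustration
String serverName = "some-server";
int queueSize = 42;

// Template version: the sentence reads in one piece.
System.out.printf("Hello, %s, welcome to %s! There are %d before you in line%n",
        userName, serverName, queueSize);

// Concatenation version: the same sentence, chopped up by + operators.
System.out.println("Hello, " + userName + ", welcome to " + serverName
        + "! There are " + queueSize + " before you in line");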

But it's not just about that in the present - after all, in your code, that's simply not at all the case. All that is currently needed is, evidently, "%d".

A second reason is if you need to format a thing. For example, while "%d" and "%s" rarely change anything, "%f" almost always does: you really should never print a double or float value directly, you want to control how the value is converted to text, because it tends to carry small rounding errors. For example, 0.1 does not exist. Seriously, it doesn't. For the same reason 1/3 does not exist in decimal (try it, write it down on a piece of paper), 0.1 does not exist in binary: computers count in binary, and any fraction whose denominator is not purely a power of 2 cannot be represented exactly. You can still print it fine, but that's because System.out.println applies a very light bit of rounding. More usually you want better control over this, hence, this:

System.out.printf("%.6f", someDoubleValue);

is definitely useful, nay, required, in contrast to System.out.print(someDoubleValue).
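
A small sketch of the rounding issue described above (0.1 + 0.2 is the classic example; the exact digits are a property of IEEE double arithmetic, not of these particular statements):

double d = 0.1 + 0.2;            // not exactly 0.3 in binary floating point
System.out.println(d);           // prints 0.30000000000000004
System.out.printf("%.6f%n", d);  // prints 0.300000 - you choose the rounding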

The same logic applies to width specifiers such as %20s, sketched below. But this, too, does not currently apply to your situation.
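
For completeness, a sketch of what a width specifier does (the field width of 20 is arbitrary here):

System.out.printf("|%20s|%n", "myRank");   // right-aligned in a 20-character field
System.out.printf("|%-20s|%n", "myRank");  // left-aligned in a 20-character field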

The future is a moving target

Most software isn't released-and-done. This is 'weird' - if a carpenter produces a table, they tend to sell it and move on to another project. Software isn't really like that - the vast majority of it is modified almost continuously throughout its lifetime. Hence, the ability to be easily modified is an important property of a software project: projects whose code doesn't lend itself well to updates are bad projects, for the simple reason that modification is going to happen sooner rather than later, and code that is better 'prepared' for it wins. It's like a cupboard you intend to disassemble and move later: you should buy a cupboard that is screwed together with good screws, instead of one with screws you're going to strip during assembly, or, even worse, a cupboard that you have to glue together.

Virtually all software is like 'a cupboard you will have to disassemble a lot' - there is very little market for throwaway software that you build, use, and discard entirely.

Craftsmanship

This gets us to craftsmanship: it's a useful skill to be able to build software. Specifically, in a world where almost everybody wants disassemblable cupboards, it's not useful to learn the skill of efficiently gluing cupboards together. It's much more useful to be really good at building screwed-together ones.

If learning the craft is really difficult to do, it makes sense to optimize for the useful, common tasks, especially if those serve well enough for alternate, lesser requirements. Hence, good programmers optimize for writing good code, even if there is no foreseeable need for e.g. code that can easily be modified later. For the same reason someone tasked to build a cupboard that will be used and discarded, might still use screws, because they aren't expensive and they know how to build cupboards that way, so, why not?

But.. inefficient!

Ludicrous. You will not be able to measure any performance difference here. It's a fair argument if the job requires "good performance" based on something specific (just 'it needs to perform well' is far too nebulous a requirement), and it's a fair argument even if the job does not require that now but it is foreseeable that in the future this code will grow some requirements (i.e. needs to be modified), and one of these requirements is 'it needs to perform well'.

But, any objective measure you care to come up with that is at all related to real life requirements will not disqualify this code for being 'too inefficient', hence, worrying about that is like worrying about an errant pencil mark on a side of wood paneling that will be invisible once a cupboard is built. It's actively silly to worry about the irrelevant; programming is hard enough without making up rules that neither requirements nor the general need to write good software actually requires you to adhere to.