2030?
I hardly think it will matter by then; coders to whom the very idea of what a CPU actually is, let alone what instructions it executes, means anything will be long gone.
The future is pretty grim for old-school coders like myself. The science and art of real coding will be the domain of a precious few: the people whose job it is to implement the back ends of high-level compiler systems and the like.
Everything else that still requires any human input by then will probably be written in ultra dipsh*t high-level RAD tools. No doubt people will call whatever flavour-of-the-month XML-like system is in use by then "code", and "coding" will mean using it to describe some behaviour you want implemented. In the worst-case scenario, this will even impress people by then.