An exclusive gaming-industry community targeted at, and designed for, professionals, businesses and students in the gaming, new-media and web sectors, and the business and industry closely related to them.
A rich, content-driven service including articles, contributed discussion, news, reviews, networking, downloads and debate.
We strive to cater to cultural influencers, technology decision-makers, early adopters and business leaders in the gaming industry.
A medium to share or contribute your ideas, experiences, questions and points of view, or to network with other colleagues here at iVirtua Community.
From my point of view, I don't think CSS is a programming language... then again, my view goes for Unix and Apache scripting too. <_<
Oh, and don't forget Python, although that's an easy language. Ooh, and Brainfuck.
Maybe there should be a poll for the easiest programming languages :P
And er... you wouldn't use binary to write programs :\ Certainly not in today's world, and certainly not in the 1990s. About the only time you would really work with raw binary is when saving to file formats and converting strings and numbers.
The lowest level you would use today is assembly, which these days you don't even need; about the only thing it's still used for is driver programming (yes, there are other uses.. but this is the year 2005!)
To be honest, I don't know JACK SQUAT about programming. But I do know that everything runs on 1s and 0s.
I figured no one used binary themselves; rather, they had tools to manipulate it.
It just always amazes me how much you can fit in such a small amount of space. Even the data used to encode this page and reproduce it is staggering if you think of it in 1s and 0s.
I don't understand how a program/CPU knows that 00000001 means 1 and not, say, A, or an instruction to start something. I know that's pre-defined by the data ahead of it... but it still amazes me.
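As a sketch of the answer (in Python, just for illustration): the bits themselves carry no meaning at all; it is the context in which the CPU or program reads a byte that makes 00000001 a number rather than a letter or an instruction.

```python
# A byte is just 8 bits; what it "means" depends entirely on how it is read.
b = 0b00000001

# Read as an unsigned integer, it is the number 1.
assert b == 1

# Read as a character code, it is an unprintable control character, not 'A'.
assert chr(b) == '\x01'

# On an x86 CPU, that very same byte 0x01 begins an ADD instruction.
# The CPU never guesses: bytes at the instruction pointer are decoded as
# opcodes, while bytes fetched by a load instruction are treated as data.
```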
People do use binary, and not just through tools that manipulate it. It's not used much anymore, even for driver programming; however, it's useful if you do know it, and there are times when I have needed to use it.
A CPU just reads and writes memory locations and values, and a CPU only understands binary numbers. For instance, the ASCII letter 'A' is 65 in decimal, and 65 in binary is 1000001. Thus 'A' in binary would be 1000001. (It doesn't *quite* work like that.. but just to show an example.)
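The 'A' example above can be checked in a couple of lines of Python (a sketch of the encoding, not of how a CPU physically stores characters):

```python
# 'A' is ASCII code 65, and 65 in binary is 1000001.
assert ord('A') == 65
assert bin(65) == '0b1000001'
assert format(65, '07b') == '1000001'

# Going the other way: the bit pattern 1000001 decodes back to 'A'.
assert chr(0b1000001) == 'A'
```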
Why in the world use binary when there are so many programming languages? It seems like a complete waste of time to me.
Because programming languages can't be used in those situations.. in fact, they don't exist there. There is no magical programming language that works on everything and can do everything. Binary is not *really* a programming language; it's a number system. It might be considered machine code too.. like assembly.
Not only that, but sometimes it's the best or only option, like when dealing with certain parts of a program or a file format. Programming languages can't manipulate everything, and in the context of what I just said, you would use binary for the mathematical operations around the thing you wanted to change. Obviously you would not write whole programs in binary, but if you were programming something like your own OS, interacting with hardware, or dealing with raw values or file formats, then you would most likely use binary at some point.
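As a rough illustration of that kind of work (the 8-byte header and its field layout are invented here purely for the example), Python's struct module packs and unpacks binary data, and bitwise operators manipulate the individual bits:

```python
import struct

# A made-up file header: 2-byte magic number, 2-byte version,
# 4-byte payload length, all little-endian.
header = struct.pack('<HHI', 0xCAFE, 3, 1024)
assert len(header) == 8

magic, version, length = struct.unpack('<HHI', header)
assert (magic, version, length) == (0xCAFE, 3, 1024)

# Bitwise operations work directly on the underlying binary values,
# e.g. packing two 4-bit fields into one byte:
hi, lo = 0b1010, 0b0011
packed = (hi << 4) | lo        # 0b10100011
assert packed >> 4 == hi       # recover the high field
assert packed & 0x0F == lo     # mask off the low field
```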
Last edited by kahrn on Thu Dec 29, 2005 4:37 am; edited 1 time in total
Machine code is the hardest one out there. I don't know why you're all saying C++; I'm learning it, slowly maybe, but it really isn't difficult. In fact, I like making programs - even if they are useless :P
In the end, the data can only be handled by a CPU, or any other part of a computer, as an electrical current, where 1 is a high voltage and 0 is a low voltage.
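Those 1s and 0s map to ordinary decimal numbers by place value; here is a minimal sketch in Python of the conversion in both directions:

```python
def to_binary(n):
    """Repeatedly divide by 2; the remainders are the bits, lowest first."""
    bits = ''
    while n > 0:
        bits = str(n % 2) + bits
        n //= 2
    return bits or '0'

def from_binary(bits):
    """Each bit doubles the running total: 1000001 = 64 + 1 = 65."""
    value = 0
    for bit in bits:
        value = value * 2 + int(bit)
    return value

assert to_binary(65) == '1000001'
assert from_binary('1000001') == 65
```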
Yeah, I know. But still, it seems like it'd be a lot easier to use a language, which makes what you have to type smaller.