In my ongoing quest to become a better hipster, I stumbled upon this book, which is inarguably one of the greatest gifts the Internet has bestowed upon humanity thus far.
You are all so welcome.
Saturday, September 29, 2012
Thursday, September 27, 2012
Graceful Awesome Things
Sections 7.5-7.8 were essentially a survey of features that make code less redundant than it might otherwise be. Some takeaways:
- Short-circuit evaluation is a helpful thing.
- Conditional targeting in assignment statements is cool. All this stuff resembling C's ternary operator and Scala's if ([condition]) [return-value] else [return-value] syntax is nice, really. Code is made briefer without becoming any more difficult to understand.
- Compound and unary assignment operators are good for making us feel clever (probably unduly).
- Also, using assignments as expressions makes us feel clever (but entirely duly this time, since the feature is just that neat).
- Multiple assignment is pretty much wonderful.
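Most of the features in that list can be seen side by side in Python, a sketch under the caveat that Python spells some of them differently than C does (its conditional expression reads like Scala's, and assignment-as-expression only arrived with the 3.8 "walrus" operator):

```python
# Short-circuit evaluation: the right operand only runs if it's needed.
def fallback():
    raise RuntimeError("never called")

name = "Ty" or fallback()  # "Ty" is truthy, so fallback() never runs

# Conditional expression (Python's take on C's ternary / Scala's if-else):
x = 7
parity = "odd" if x % 2 else "even"

# Compound assignment:
total = 10
total += 5  # total is now 15

# Assignment as an expression (the := "walrus" operator, Python 3.8+):
if (n := len("hello")) > 3:
    message = f"{n} characters"

# Multiple assignment (tuple unpacking):
a, b = 1, 2
a, b = b, a  # swap without a temporary variable
```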
One point of consternation: this stuff about bitwise operators in the C-based languages... Does this mean the operands themselves can't be Boolean? I'm assuming that's not the case, as such a thing wouldn't be terribly useful... Anyway, if we could go over this bit tomorrow, that'd be great. The antepenultimate paragraph of 7.6 was a little confusing.
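For what it's worth, Python behaves a lot like the C family here, so it makes a decent sandbox for the question: the bitwise operators work bit by bit on integers, and since Booleans are just a subtype of int (True is 1, False is 0), Boolean operands are perfectly legal. The catch is that, unlike `and`/`or`, they never short-circuit. A quick sketch:

```python
# Bitwise operators work bit by bit on integers:
assert 0b1100 & 0b1010 == 0b1000   # AND
assert 0b1100 | 0b1010 == 0b1110   # OR
assert 0b1100 ^ 0b1010 == 0b0110   # XOR

# bool is a subtype of int, so Boolean operands are fine too:
assert (True & False) is False

# Unlike `and`, bitwise `&` always evaluates both operands:
calls = []
def check(label, value):
    calls.append(label)
    return value

result = check("left", False) and check("right", True)   # short-circuits
result2 = check("L", False) & check("R", True)           # evaluates both
# calls records ["left", "L", "R"]: "right" was never evaluated
```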
Tuesday, September 25, 2012
A Pleasantly Lucid Batch of Sections
In terms of all this stuff about precedence and associativity, I'm very pro-APL. In my humble opinion, the parentheses approach actually makes expressions more readable across the board. Uglier, perhaps. But more readable, especially if we make the provision (a utopian one) that our programmers will be somewhat reasonable in their use of consecutive parentheses.
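One concrete case where precedence and associativity bite, sketched in Python (APL, for the record, goes further than this and just evaluates strictly right-to-left with no precedence at all): exponentiation is right-associative, which surprises plenty of people, and a pair of parentheses settles the question either way.

```python
# Right-associative: 2 ** 3 ** 2 is parsed as 2 ** (3 ** 2), not (2 ** 3) ** 2
assert 2 ** 3 ** 2 == 512        # 2 ** 9
assert (2 ** 3) ** 2 == 64       # what a left-to-right reading would give

# Mixed precedence: unary minus binds looser than ** in Python
assert -2 ** 2 == -4             # parsed as -(2 ** 2)
assert (-2) ** 2 == 4            # the parentheses remove all doubt
```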
This chapter has a whole lot to say about the contention between those supporting increased flexibility in languages and those who would prefer to opt for better error-checking to make sure programmers don't make stupid mistakes. I guess I'd fall into the former camp, since it's really nice to be able to do some of the things that operator overloading and type coercion allow us to do. But I'll admit that my feelings will vary from one programming task to another, depending on how much I trust the programmers responsible and how crucial the program at hand is. I mean, if we take Dr. Stonedahl's example of Ada being used for computing that holds human lives in the balance... Well, yeah. I'll take extra protection over coding flexibility any day for a case like that.
Really, I just think that if people would stop making any mistakes at all, forever, things would be much easier for everyone.
Lazy, error-prone coders...
Wednesday, September 19, 2012
Getting Scared
I've always thought of myself as a pretty good reader. Not the kind to skim over things. I mean, if you quoted a sentence from a book I'd read recently, I'd probably be able to tell you the context and about where on the page it appeared. Lawl. Not so with this stuff.
I think what we have here is a case of Ty not having the necessary frames of reference to keep his information neatly organized and vaguely memorizable. It's such a relief to see any reference to Python or Java in the book, because I have some semblance of an ability to retain such things. When we start launching into COBOL, Fortran, C, F#... really anything but Python or Java... there's just no hope.
Some BASIC might even be welcome at this point, given our recent firsthand experience.
Is this just me? Anyone else having these kinds of issues? This isn't just cuz I haven't had 221, is it?
Anyways. Blergh. Doing my best.
I enjoyed learning about some of the data types (namely Heap-Dynamic Arrays, Associative Arrays, Records, Tuples, and Lists) that we'd become accustomed to using in our programming, and I feel like I could remember the stuff. ("Associative Arrays. Okay, that's like a HashMap in Java or a dictionary in Python. Sweet." Stuff like that helps. Frames of reference!)
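To pin down that frame of reference, here's a minimal sketch in Python of the associative array alongside the record/tuple/list distinctions the chapter draws (the names are made up for illustration):

```python
from collections import namedtuple

# Associative array: Python's dict (Java's HashMap), keyed by hashable values
grades = {"alice": 92, "bob": 85}
grades["carol"] = 78                # insertion
assert grades.get("dave") is None   # lookup with a default for missing keys

# Record: fields accessed by name; namedtuple is Python's lightweight version
Point = namedtuple("Point", ["x", "y"])
p = Point(x=3, y=4)
assert p.x == 3

# Tuple: fixed-size and immutable; list: variable-size and mutable
pair = (1, "one")
nums = [1, 2, 3]
nums.append(4)
```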
Also, 6.11 was pretty great. I wish it had come a little sooner. But it helped a lot with understanding all this hooplah about the mysterious "heap." And pointers, for which I seem to have acquired, through mere proximity to the rest of you folks, the ubiquitous communal fear.
Sunday, September 16, 2012
Some Ketchup
No post for long time!
This has to do with the readings from 5.1-5.4. More very soon, I promise. I know all you hordes of ravenous blog-followers have been going through some serious withdrawal in this past week. Here's one to tide you over for now.
I don't have a strong opinion on case-sensitivity, but I do think that the all-caps stylings of BASIC 'n' friends have a cool retro vibe. As for writability issues surrounding case-sensitivity... I'm not really hung up on this. IDEs like Eclipse can help with these little issues, for one thing. Maybe it's unnecessary complication, but I think we can all agree that the whole point of coding is to impress one's friends by doing complicated-looking things. Right? So I can't really see a problem with this. But srsly, I don't think case-sensitivity is a big deal.
What is a big deal is the topic of binding. It is a very big deal. And it hurts my brain to think about it. I mean, the whole distinction between static and dynamic type binding is fine. We have frames of reference for that. But I just don't know that my uninitiated, pre-Computer Organization/Operating Systems/Compilers brain can wrap itself around all this Stack-Dynamic/Explicit Heap-Dynamic/Implicit Heap-Dynamic stuff. May have to read that section a couple dozen more times.
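One frame of reference that helped here: Python itself is the implicit heap-dynamic case. Every object lives on the heap, and a name gets bound to an object (and thereby to a type) only when an assignment actually executes, at run time. C's local variables are the stack-dynamic case, and C's malloc/free are explicit heap-dynamic. A sketch, assuming the usual CPython semantics:

```python
# Implicit heap-dynamic: binding happens at assignment, at run time.
x = 42            # x is bound to a heap-allocated int object
assert type(x) is int

x = "forty-two"   # same name, rebound to a str object; no declaration needed
assert type(x) is str

# The old int object is reclaimed by the garbage collector once unreferenced.
# For contrast (C, shown here only as comments):
#   int n = 42;           /* stack-dynamic: storage bound on block entry  */
#   int *p = malloc(4);   /* explicit heap-dynamic: programmer-managed    */
#   free(p);
```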
Thursday, September 6, 2012
Syntax Parsing, BNF Grammar, Republicans, etc.
I thought it was fascinating to read about the sort of syntax parsing used for the analysis of programming languages. The language recognizers and generators briefly described in the chapter sound like they might also be effective tools in the realm of natural language processing. They'd just have to be incredibly complicated, is all. I'm mostly excited about this because of its potential relevance to AI systems that can understand and communicate through natural language, which is, of course, one more important step toward my conquering the galaxy through the military superiority of my cyborg armies and one day establishing supreme hegemonic rule over all of known space... er, making the world a better place for everyone!
The BNF notation strikes me as a cool method for describing languages and pointing out differences between them in papers and such. It's particularly neat to see this in documentation for languages, example cases of which you can find right here. And also here! (What we're learning is helping us to learn other stuff more effectively O.o !?) I do wish people would listen to ISO every once in a while and go along with a standardized notation, though. Everyone using their own bastardized version of the same notation system just isn't helping anyone...
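To make the notation concrete, here's a toy grammar in one common BNF dialect (the notation varies, as noted above) and a matching recursive-descent recognizer, sketched in Python; the grammar and names are invented for illustration:

```python
# Toy grammar:
#   <expr>  ::= <term> | <term> "+" <expr>
#   <term>  ::= <digit> | "(" <expr> ")"
#   <digit> ::= "0" | "1" | ... | "9"

def recognize(s: str) -> bool:
    """Return True if s is a sentence of the toy grammar above."""
    pos = 0

    def expr() -> bool:
        nonlocal pos
        if not term():
            return False
        if pos < len(s) and s[pos] == "+":   # the <term> "+" <expr> branch
            pos += 1
            return expr()
        return True

    def term() -> bool:
        nonlocal pos
        if pos < len(s) and s[pos].isdigit():   # the <digit> branch
            pos += 1
            return True
        if pos < len(s) and s[pos] == "(":      # the parenthesized branch
            pos += 1
            if expr() and pos < len(s) and s[pos] == ")":
                pos += 1
                return True
        return False

    return expr() and pos == len(s)   # must consume the whole input
```

Each nonterminal in the grammar becomes one function, which is the whole charm of the technique: the code is a near-transliteration of the BNF.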
Not much else to say at the moment. Nice catch on the stuff about the various versions of ALGOL and their mysteriously and counter-intuitively differing levels of understandable metalanguage, Mary. I doubt I would have noticed that. It's a funny thing about humans, that we tend to see things as needing attention only when their failures are staring us in the face. The metalanguage used to describe ALGOL 60 was perfectly lovely (warning: making assumption here), so less attention was paid to that aspect of language description when the folks were working on the new version of ALGOL, resulting in prohibitively abstruse documentation for ALGOL 68. On the other hand, if the documentation for ALGOL 60 had been inscrutable, the ALGOL 68 stuff would probably have been a lot clearer. When problems are current, we see them as needing to be fixed, so we fix them! When the problem has been fixed for a while, it's easy to forget that there was ever a problem in the first place and to fall back into making the same mistakes all over again. Incidentally, this phenomenon also serves to explain why Republican presidents are sometimes elected.
Wednesday, September 5, 2012
Sundries
The sheer volume of different programming languages being used today should be enough to intimidate many novice programmers (such as myself), especially considering that, in practice, one often finds that several languages are used in tandem to accomplish separate parts of a larger task. This makes me think that the programmers arguing so vociferously for a particular language probably aren't just defending their language and attacking others because they can. We can put it down to survival instinct. After all, if someone's comfortable in one language and really dreads the thought of learning a new one -- or if, like the guy in the Duck Punching video, you have to stop using a language you're familiar with because your manager thinks Ruby is the way of the future -- it makes some sense that one would lash out. None of us wants extra work. And none of us wants to feel like his/her livelihood is threatened by changing times.
Anyway, I'd been thinking about that. Goes with the thing from the last post about programmers not being immune to a sort of mythology. We'll move on to this week's stuff now.
To start: I'm glad this exists.
As for the book-learnin', I'm having a hard time retaining the info from the light-speed overview of these 20ish different languages (and their offshoots) we've looked at in ch. 2. I fully support Lauren's idea for creating a table to compare the languages, and I'll be more than happy to contribute to it myself. But honestly, I don't know that this stuff is going to stick in my mind without firsthand experience of each language. (And even then... we'll see. There's a lot of stuff!) The practice with BASIC is certainly helping, at least.
Ian makes a good point about many of the online Java vs. Python resources being one-sided. I'd think that this could be partly attributed to Java often being used in the workplace for its reliability and huge-ness. On account of its widespread use, Java doesn't really have to defend itself. Pythonistas may feel that they have to fight for their language to get its time in the limelight, because it's just not as big a name at this point. Of course, it's also possible that the sorts of people who code in their free time (a group which likely contains most of the people who feel inclined to write blogs about programming) just prefer using Python. Maybe for readability/writability reasons. Maybe because they're writing relatively small programs, for which Python is just fine (and sometimes even better suited than Java). Can't really say for sure why this phenomenon exists.
Except for the fact that Python is just better.
zing
Monday, September 3, 2012
Surveying Languages/The History Thereof
I've decided that I'm pretty psyched about looking into Lisp and Prolog, especially. AI's pretty exciting, after all. Don't think anyone's going to contest that. (Even the people who are afraid of robots taking over the world are still, technically, excited about it, right? Just not in the positive sense...)
It was surprising to see how similar some of the syntax rules of ALGOL and BASIC are to the ones we've already studied. Very convenient for us! Or maybe it's inconvenient...as they're all bound to start blending together pretty soon.
Anywho, I'm not going to go into the programming language history that we looked at in the textbook, as there are plenty of sources on the interweb that have dealt with the stuff much more expertly than I could. Here's a nice little graphical history that our professor shared with us, if you're interested.
It's cute how I'm acting like people other than those in the class are reading this, isn't it?
As a young computer scientist about to be unceremoniously tossed out into the fighting pits... er, workplace, I don't find Mr. Brin's article bemoaning the lost age of students writing BASIC-level code to be especially encouraging. I'm assuming that he considers experience with higher-level languages to be less helpful because you deal less directly with the inner workings of the computer itself, which makes sense... It's good that our class is going to be looking at some BASIC this semester, then. Personally, I'm very happy about this prospect. Working with Python and Java so far, I've kind of felt like, "Oh, this is how it works because this is how it works," which doesn't feel like an altogether satisfactory explanation.
One last tidbit:
I want to draw attention to Lewis and Neumann's observation that "programmers, despite their professed appreciation of logical thought, are not immune to a kind of mythology." This is a beautiful sentence, for one thing. Whenever people invest themselves and their time in anything, they're bound to defend it against any would-be attackers. We're programmed this way. Let's just keep this in mind when Wednesday's debate rolls around.