Tuesday, November 27, 2012

Does A Bearded Computer Scientist Ever Have To Worry About Job Security?

This one's about Ruby again.

Wanted to talk a bit about exception handling in the language.  It looks and works like something in between Python and Java, but with some cool extra features.  Here's a short little intro:

In Ruby, the equivalent of Java's explicit throw is the explicit raise.  (Ruby does have its own throw and catch, but those are for non-error control flow, not exception handling.)
def intentional_raise
  puts 'This is intentional I swear.'
  raise 'Oh %&*#!'    # raising with just a string gives you a RuntimeError
  puts 'No but seriously I meant to do that.'    # never reached
end
intentional_raise

Typing all of this into the interactive Ruby shell (irb), we get:

This is intentional I swear. 
RuntimeError: Oh %&*#! 
        from (irb):3:in `intentional_raise' 
        from (irb):6 
        from C:/Ruby193/bin/irb:12:in `<main>'



But say we want to recover from the exception in some way, and maybe retrieve some info from it. We can use rescue for this.
def kentucky
  puts 'Well hi!'
  raise 'I was raised in Kentucky.'
rescue Exception => e    # note: Exception catches absolutely everything; rescuing StandardError is the usual practice
  puts e.message         # just the message, no backtrace
end
kentucky

This time we simply get the message from the Exception, rather than having to see all those ugly details:

Well hi!
I was raised in Kentucky.



As one might hope, you can also catch specific kinds of exceptions. And heck, a single rescue block can catch more than one kind, if you'd like! (Also note that you don't have to do all this stuff inside a function. Cuz that would be stupid.)
begin
  raise NameError, 'Your name is so horrible that I just had to say something.'
  x = 1/0    # never reached -- the raise above fires first
rescue NameError, ZeroDivisionError
  print 'I either caught myself before I said that rude thing out loud '
  puts  'or saved the world from your negligence in trying to divide by zero.'
  puts
end

begin
  raise SecurityError, 'During the debate festivities, not all parts of the perimeter were guarded.'
rescue SecurityError
  print 'Everything turned out okay at the debate festival, '
  puts  'in spite of potential security issues.'
  puts
end

begin
  raise 'Come at me bro! Do you even lift?!'
rescue    # a bare rescue catches StandardError and its descendants -- same as rescue StandardError (rescue RuntimeError would also work here; rescue Exception casts an even wider net)
  print 'I don\'t have a plan for dealing with any particular exceptions, '
  print 'and I\'m especially afraid of that aggressive one that speaks in memes, ' 
  puts  'so I\'ll just re-raise anything I run into.'
  puts
  raise 
end

Here, we get (note that this one wasn't run from irb):

I either caught myself before I said that rude thing out loud or saved the world from your negligence in trying to divide by zero.

Everything turned out okay at the debate festival, in spite of potential security issues.

I don't have a plan for dealing with any particular exceptions, and I'm especially afraid of that aggressive one that speaks in memes, so I'll just re-raise anything I run into.

C:/Program Files (x86)/Notepad++/exceptionTest.rb:17:in `<main>': Come at me bro! Do you even lift?! (RuntimeError)
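
Oh, and two more goodies the intro above glosses over: ensure clauses always run (like Java's finally), and retry re-runs the begin block.  A minimal sketch of my own devising (not from the book, so take it with salt):

def stubborn_toaster
  attempts = 0
  begin
    attempts += 1
    raise 'Toaster jam!' if attempts < 3    # fail the first two tries
    puts "Toast achieved on attempt #{attempts}."
  rescue
    retry if attempts < 3    # jump back to the begin and try again
  ensure
    puts 'Unplugging the toaster either way.'    # runs once the block finally exits
  end
end
stubborn_toaster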



Monday, November 26, 2012

'Nother Post for Today

This one's more directly related to Ch. 13.

Today in 300 (where we're also talking about threads), Dr. Oldham grumbled something to the effect of "If you're dealing with multiple threads, you're going to be looking at some performance gains."*  And I seem to remember from my course in Ireland that concurrency can result in more efficient execution overall, even on single-processor machines.**  But in what situation would this be the case?  I mean, it's obvious that you're going to get more efficient execution if you have multiple processors and the processes involved take advantage of them.  It's similarly obvious that concurrency is desirable for a wide variety of purposes where you at least want to simulate multiple things happening at once (user interfaces, graphical video games, etc.).  But how does it make sense that a single processor would be able to complete execution more quickly if it divides its time between multiple tasks?

To use an analogy:  I have three different kinds of bread and a toaster with a single slot.  I cut the loaves into slices.  Why should it be more efficient to alternate between slices of different types of bread, rather than just proceeding from one loaf to the next in order?  Isn't switching between bread types just extra overhead?  (In fact, in many situations this overhead can get out of hand very quickly if a process uses more threads than there are processors on the machine.)

What's an example of a situation where we could see an improvement in performance thanks to multithreading on a single-processor machine?***
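
The best case I can come up with is I/O-bound work: if the threads spend most of their time waiting on something external (disk, network), a single processor can overlap the waits.  In toaster terms: if every slice had to thaw before toasting, you'd want a different slice in the toaster while the next one thaws.  Here's a hedged little Ruby sketch of my own, with sleep standing in for a real I/O wait:

require 'benchmark'

def toast_one_slice
  sleep 0.5    # stands in for an I/O wait; the CPU sits idle here
end

sequential = Benchmark.realtime { 3.times { toast_one_slice } }

threaded = Benchmark.realtime do
  threads = 3.times.map { Thread.new { toast_one_slice } }
  threads.each(&:join)    # wait for all three to finish
end

puts "sequential: ~#{sequential.round(1)}s, threaded: ~#{threaded.round(1)}s"
# roughly 1.5s vs 0.5s: the waits overlap, even on one processor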


*He actually said nothing that even vaguely resembles this sentence, but it sure sounded like this is what he was getting at.

**It is conceivable that I am misremembering this.

***Please note that, as this page mentions, "the design of concurrent applications is a very complex process that is normally performed by people who make a ton of money, because it takes years of study to figure out how exactly a given task can benefit from concurrent execution."  So maybe it's okay if we don't completely understand this just yet. 

For Science

Turns out that distributed computing has provided us with a lovely way to donate computational resources to research for good causes!  Through IBM's World Community Grid and BOINC (Berkeley Open Infrastructure for Network Computing), computer users with Internet access can volunteer their systems' unused cycles to help solve massively time-consuming problems.  The work is broken down into little chores, and separate systems handle these chores simultaneously, which means research projects with limited funding don't have to spend it all on computational muscle.  And the work gets done much faster!

Find out more at the World Community Grid website and consider participating!

Thursday, November 15, 2012

Whoa there

I wanted to respond to Brooks' post from a couple days ago.  Rails has gotten a bad rap since Twitter switched to using Java/Scala.  This article, however, contends that it isn't a problem of Ruby scaling poorly, but of the folks at Twitter using Rails incorrectly.  And this guy on stackoverflow also has a great (albeit somewhat old) comment on the stuff.

If you're curious about just how widely used some of these web frameworks are, check out this site.  PHP and ASP.NET dominate the field, of course.  It's interesting to note, though, that Ruby on Rails constitutes a negligible portion of the top million sites, but makes up a rather larger portion of the top 10,000.  (I'm not gonna say Rails was necessarily the key to their success or anything, but...)

One might also find it interesting that Amazon is one example of a site that makes use of Rails.

Sunday, November 4, 2012

Paul Graham Is A Delightful Writer

Exhibit A: Revenge of the Nerds.
Exhibit B: Achewood (in no way Paul Graham-related but delightful, nonetheless)

Did anyone else find it interesting that many of the languages Paul Graham mentions in his article seem to be those whose inefficiency is often considered one of their primary disadvantages (Python, Ruby, Lisp, Smalltalk)?  I assume that this is at least in part because these less-used languages can speed up the coding process itself by providing enhanced writability.  But I guess a lot of computing tasks don't really demand the efficiency of C/C++/Java to begin with.

In other news, C++ still scares me.

Monday, October 29, 2012

Substancelessness

Again, not much original thought to contribute here.  It's interesting to see how utterly different Lisp/Scheme/Racket are from anything else we've looked at so far.  Conveniently, it all feels a lot more natural than I had anticipated.  The wider use of recursion looks like the most potentially mind-boggling trait of all this functional stuff, at least in what the book has presented in this first bit of Ch. 15.

In fact, the Lisp/Scheme/Racket stuff may be coming somewhat more easily precisely because it's so foreign.  We're going in with an open mind, y'know?  Related to this idea, Brooks had a nice observation about the issue of thinking in terms of a certain language when you're coding. I wonder if this very problem might have contributed to making Scala seem so difficult.  After all, so much of the syntax was like Java.  And we were working in Eclipse, for which Java programming is the only frame of reference that many of us have.  Maybe that familiarity was more a drawback than an advantage.

Thursday, October 25, 2012

The Blub Paradox

This is a pretty fantastic little article, right here.  It seems that it is now time to determine which language is the most powerful and use it exclusively until something better is created.  Is it still Lisp?  Or is it, perhaps... Ruby?  :O  We should probably figure this out once and for all during tomorrow's class.  

Not a lot to talk about this time.  Just very excited about getting into Lisp and whatnot.  (And to check out everyone's games tomorrow!)

Wednesday, October 24, 2012

Apparently This Is The Blog Where We Brag About How Far Along In The Ada Game We Are

We're frickin' done.

Just a little more commenting and then the README to go.

Here's what a certain famous critic said after we let him try the beta:
"This revolutionary artistic foray into the realm of imagination and beauty will touch players emotionally, moving them from tears of delight, to tears of nostalgia, to tears of disbelief, to tears of frustration, to tears of despair, to tears of not-wanting-to-go-on-anymore, to tears of just-being-finished-with-this-nonsense-and-instead-playing-a-game-that-doesn't-kill-them-when-they-try-to-pick-up-some-random-item-because-how-are-you-supposed-to-anticipate-that-sort-of-thing-honestly-?."
      -Roger Ebert

Bring your tear-rags, folks.


So anyway.

I figured that it's a good time to talk about Objective-C, because I'm pretty sure this is the first time we've seen it in the text.

It's a little baffling to me that the Apple folks would choose to have the iPhone programmed in this formerly somewhat obscure language, instead of something that was... actually used by anyone.  I mean, we saw on the TIOBE index that the language was virtually unused prior to its designation as the language to be used for writing iPhone apps.  So I wonder: Why did they choose to use it?  What makes Objective-C stand out?  Were the language designers just really wealthy and sad that no one was using the language they made, so they paid Apple to make it popular?

Google's decision to use Java for Android is easier to understand.  Since Java is so widely used in the first place, programmers face a gentler learning curve in beginning to write apps for distribution on the Android marketplace.  And when you're trying to populate a market with user-created products that extend the capabilities of a device you're selling, relying on knowledge of a very widely used language makes a lot more sense than demanding that folks learn something new.

But hey.  It worked out for Apple.  Objective-C is currently #3 on the TIOBE index, just after Java and C.  Craziness.

Sunday, October 14, 2012

Those Times When You Hurt The Formal Parameters' Feelings Because You Made Them Sound Like They Weren't Even Real

I'm still a little flabbergasted by how nicely these sections on subprograms and their parameters coincided with my most recent question on the Q&A.  The powers that be do not want me confused, even for a second.

Most of this business was pretty straightforward, I thought.  My eyes did glaze over a bit during 9.5.6 (Multidimensional Arrays as Parameters), but I'm not actually sure if that's important enough for us to go over in class.

It would be great, though, if we could talk about the pass-by-name method!  Not sure I've caught on to that yet, even after seeing by-name parameters a bit in Scala.  And I sense it's going to be relevant when we get to the functional paradigm.
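
In the meantime, here's my possibly-wrong mental model, in Ruby: pass-by-name is like handing over the argument expression itself, to be re-evaluated every time the parameter is mentioned.  You can fake that with a lambda (a "thunk").  A hedged sketch, names all mine:

# the parameter is a thunk; each mention re-evaluates the original expression
def print_twice(thunk)
  puts thunk.call
  puts thunk.call
end

i = 0
print_twice(lambda { i += 1 })    # prints 1, then 2 -- the argument is "live"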

Python has a cool thing going on with its def statements being executable.  With certain functions (if they're brief enough that this wouldn't impair readability), it seems like it could be nice to have their definitions be conditional (as in the way the book demonstrates at the top of p. 390).
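
Ruby's def is executable too, for what it's worth.  A hedged sketch of a conditional definition (the method names are made up):

# which definition exists depends on what ran at load time
if RUBY_PLATFORM =~ /mingw|mswin/
  def path_separator; '\\'; end
else
  def path_separator; '/'; end
end
puts path_separator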

Keyword parameters seem like a pretty epic thing to me, so I don't know what you're on about with that, Brooks.  They sure help readability when you've got all kinds of parameters flying around.  I do agree with you on the matter of Ruby's parameter system being complex, though.  It's just flat-out scary, really...  Fortunately, I've been able to find absolute boatloads of hipsters talking about this stuff (and hopefully making it clearer) here, here, here, and here.
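
For reference, what Ruby (as of 1.9, anyway) uses in place of true keyword parameters is the trailing-options-hash idiom.  A little sketch of my own:

# callers can label the optional stuff; order stops mattering
def order_toast(bread, options = {})
  darkness = options.fetch(:darkness, 3)    # default to 3 if not given
  butter   = options.fetch(:butter, true)
  puts "#{bread}: darkness #{darkness}, butter: #{butter}"
end

order_toast('rye', butter: false, darkness: 5)    # 1.9's hash shorthand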

Personally, I would have liked it if the book had referred to its actual parameters and formal parameters as arguments and parameters, respectively.  Or would that have been less precise?  Are they completely interchangeable?

Sunday, October 7, 2012

Ruby Is Better Than Everything

It was bound to happen.  I was an easy target, after all.  In any case, I expect the flying seraphic yaks of the sky-hipsters will be along to whisk me over the mountains to Portland any moment now.

In the meantime: moar tasty blag.

I've always been a little put off by the fact that posttest and pretest loops are so named.  Call me crazy, but if a loop were really posttest, wouldn't it come... after the test?  And before the test for pretest, right?  Makes sense to me, at least.
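
For the record, Ruby has both flavors.  A quick sketch (the begin...end while form is a quirky idiom, so hedge accordingly):

# pre-test: the condition is checked first, so the body can run zero times
slices = 0
puts 'toasting!' while slices > 0    # prints nothing

# post-test: the body runs once before the condition is ever checked
begin
  puts 'toasting anyway!'            # prints once, despite the false condition
end while slices > 0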

Here I join my voice to the communal "Whut?" (initiated by Cara over here) regarding Python's inclusion of an optional else clause with its for loops.  It's a singularly useful abomination, though, and I could honestly see myself writing one of these in the future.  Readability be damned (not actually; I just don't see how it interferes with readability all that much).

Why is it that so many languages, including Python, have constructs like range(x) return all the numbers up to but excluding x?  Strikes me as a little counter-intuitive.  I'm sure they have their reasons, though...
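
Ruby at least makes the choice explicit, with .. for inclusive ranges and ... for exclusive ones.  A tiny sketch:

(0..3).to_a    # => [0, 1, 2, 3]
(0...3).to_a   # => [0, 1, 2]
# the exclusive flavor pairs nicely with zero-based indexing:
# (0...arr.length) is exactly the set of valid indices into arr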

And one last thing:
It sure is neat that we can use something like

for (ptr = root; ptr != null; ptr = traverse(ptr)) {
  // tentatively fixed, because I think the book
  // may have erroneously printed ptr == null for its control expression
  ...
}

to traverse our data structures.  I didn't realize this before 8.3, and it's a swell thing.

Last Wednesday's Blog...Today!!1

Like Brooks, I'd taken for granted the fact that we have all these lovely control structures to account for the variability of contexts/user input for our programs.  8.1-8.2 opened my eyes to how convenient multiple selection structures really are (and to how silly Java was to forbid the use of switches on Strings before SE 7... Probably should have brought that up during the debate...).

(I have taken the liberty of translating the above into the mother tongue of the inexplicably Russian majority of my readers.  Courtesy of Google Translate, here it is, for your convenience: )
Как Брукс, я бы само собой разумеющимся тот факт, что у нас есть все эти прекрасные структуры управления для учета изменчивости контекстах / вход для пользователей наших программ. 8.1-8.2 открыл мне глаза, как удобно нескольких структур Выбор на самом деле (и как глупо Java был запретить использование переключателей на строк до Ю-В 7 ... Наверное, должны были принести, что во время обсуждения ...).

Nay, but srsly brethren!

Thanks be to goodness for whatever obscenely bearded technomancers came up with and put together all these nifty little things we bath-salts'd poop-flinging monkeys get to throw around so indolently.  Hug a nerd today.

Saturday, September 29, 2012

The World Is A Strange, Beautiful Place

In my ongoing quest to become a better hipster, I stumbled upon this book, which is inarguably one of the greatest gifts the Internet has bestowed upon humanity thus far.

You are all so welcome.

Thursday, September 27, 2012

Graceful Awesome Things

Sections 7.5-7.8 were essentially a survey of things that make code look less redundant than it otherwise might.  Some take-aways:

  • Short-circuit evaluation is a helpful thing.
  • Conditional targeting in assignment statements is cool.  All this stuff resembling C's ternary operator and Scala's  if ([condition]) [return-value] else [return-value]  syntax is nice, really.  Stuff is made briefer without becoming any more difficult to understand.
  • Compound and unary assignment operators are good for making us feel clever (probably unduly).
  • Also, using assignments as expressions makes us feel clever (entirely duly, since this is just that neat).
  • Multiple assignment is pretty much wonderful.  (A little Ruby sketch of all these goodies appears below.)
One point of consternation: this stuff about bitwise operators in the C-based languages...  Does this mean the operands themselves can't be Boolean?  I'm assuming that's not the case, as such a thing wouldn't be terribly useful... Anyway, if we could go over this bit tomorrow, that'd be great.  The antepenultimate paragraph of 7.6 was a little confusing.
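
And here's that sketch, as promised (minimal, and entirely my own doodling):

x = 5
x += 2                             # compound assignment
y = (x -= 1)                       # assignment as an expression; y gets 6
size = x > 6 ? 'big' : 'small'     # conditional expression feeding an assignment
a, b = 1, 2                        # multiple assignment
a, b = b, a                        # ...which makes swapping painless
puts 'fine' if a > 0 && 10 / a > 3 # short-circuit: the division is skipped when a <= 0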

Tuesday, September 25, 2012

A Pleasantly Lucid Batch of Sections

In terms of all this stuff about precedence and associativity, I'm very pro-APL.  In my humble opinion, the parentheses approach actually makes expressions more readable across the board.  Uglier, perhaps.  But more readable, especially if we make the provision (a utopian one) that our programmers will be somewhat reasonable in their use of consecutive parentheses.

This chapter has a whole lot to say about the contention between those supporting increased flexibility in languages and those who would prefer to opt for better error-checking to make sure programmers don't make stupid mistakes.  I guess I'd fall into the former camp, since it's really nice to be able to do some of the things that operator overloading and type coercion allow us to do.  But I'll admit that my feelings will vary from one programming task to another, depending on how much I trust the programmers responsible and how crucial the program at hand is.  I mean, if we take Dr. Stonedahl's example of Ada being used for computing that holds human lives in the balance...  Well, yeah.  I'll take extra protection over coding flexibility any day for a case like that.

Really, I just think that if people would stop making any mistakes at all, forever, things would be much easier for everyone.

Lazy, error-prone coders...

Wednesday, September 19, 2012

Getting Scared

I've always thought of myself as a pretty good reader.  Not the kind to skim over things.  I mean, if you quoted a sentence from a book I'd read recently, I'd probably be able to tell you the context and about where on the page it appeared.  Lawl. Not so with this stuff.

I think what we have here is a case of Ty not having the necessary frames of reference to keep his information neatly organized and vaguely memorizable.  It's such a relief to see any reference to Python or Java in the book, because I have some semblance of an ability to retain such things.  When we start launching into COBOL, Fortran, C, F#... really anything but Python or Java... there's just no hope. 

Some BASIC might even be welcome at this point, given our recent firsthand experience.

Is this just me?  Anyone else having these kinds of issues?  This isn't just cuz I haven't had 221, is it?

Anyways.  Blergh.  Doing my best.

I enjoyed learning about some of the data types (namely Heap-Dynamic Arrays, Associative Arrays, Records, Tuples, and Lists) that we'd become accustomed to using in our programming, and I feel like I could remember the stuff.  ("Associative Arrays.  Okay, that's like a HashMap in Java or a dictionary in Python.  Sweet."  Stuff like that helps.  Frames of reference!)
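
(And in Ruby, the associative array is the Hash.  A quick sketch, since frames of reference help:)

# same idea as Java's HashMap or Python's dict
breads = { 'rye' => 3, 'sourdough' => 5 }
breads['pumpernickel'] = 4    # insert
puts breads['rye']            # lookup => 3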

Also, 6.11 was pretty great.  I wish it had come a little sooner.  But it helped a lot with understanding all this hooplah about the mysterious "heap."  And pointers, for which I seem to have acquired, through mere proximity to the rest of you folks, the ubiquitous communal fear. 

Sunday, September 16, 2012

Some Ketchup

No post for long time!

This has to do with the readings from 5.1-5.4.  More very soon, I promise.  I know all you hordes of ravenous blog-followers have been going through some serious withdrawal in this past week.  Here's one to tide you over for now.

I don't have a strong opinion on case-sensitivity, but I do think that the all-caps stylings of BASIC 'n' friends have a cool retro vibe.  As for writability issues surrounding case-sensitivity... I'm not really hung up on this.  IDEs like Eclipse can help with these little issues, for one thing.  Maybe it's unnecessary complication, but I think we can all agree that the whole point of coding is to impress one's friends by doing complicated-looking things.  Right?  So I can't really see a problem with this.  But srsly, I don't think case-sensitivity is a big deal.

What is a big deal is the topic of binding.  It is a very big deal.  And it hurts my brain to think about it.  I mean, the whole distinction between static and dynamic type binding is fine.  We have frames of reference for that.  But I just don't know that my uninitiated, pre-Computer Organization/Operating Systems/Compilers brain can wrap itself around all this Stack-Dynamic/Explicit Heap-Dynamic/Implicit Heap-Dynamic stuff.  May have to read that section a couple dozen more times.

Thursday, September 6, 2012

Syntax Parsing, BNF Grammar, Republicans, etc.

I thought it was fascinating to read about the sort of syntax parsing used for the analysis of programming languages. The language recognizers and generators briefly described in the chapter sound like they might also be effective tools in the realm of natural language processing. They'd just have to be incredibly complicated, is all.  I'm mostly excited about this because of its potential relevance to AI systems that can understand and communicate through natural language, which is, of course, one more important step toward my conquering the galaxy through the military superiority of my cyborg armies and one day establishing supreme hegemonic rule over all of known space... I mean, making the world a better place for everyone!

The BNF notation strikes me as a cool method for describing languages and pointing out differences between them in papers and such.  It's particularly neat to see this in documentation for languages, example cases of which you can find right here.  And also here! (What we're learning is helping us to learn other stuff more effectively O.o !?) I do wish people would listen to ISO every once in a while and go along with a standardized notation, though.  Everyone using their own bastardized version of the same notation system just isn't helping anyone...  
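
For the uninitiated, here's the flavor of the thing, via a toy grammar for arithmetic expressions that I just made up (so don't quote it at ISO):

<expr>   ::= <expr> + <term> | <term>
<term>   ::= <term> * <factor> | <factor>
<factor> ::= ( <expr> ) | <digit>
<digit>  ::= 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9

A derivation of 1 + 2 * 3 has to route the 2 * 3 through <term>, which is exactly how the grammar itself encodes operator precedence.  Neat, right?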

Not much else to say at the moment.  Nice catch on the stuff about the various versions of ALGOL and their mysteriously and counter-intuitively differing levels of understandable metalanguage, Mary.  I doubt I would have noticed that.  It's a funny thing about humans, that we tend to see things as needing attention only when their failures are glaring us in the face.  The metalanguage used to describe ALGOL 60 was perfectly lovely (warning: making assumption here), so less attention was paid to that aspect of language description when the folks were working on the new version of ALGOL, resulting in prohibitively abstruse documentation for ALGOL 68.  On the other hand, if the documentation for ALGOL 60 had been inscrutable, the ALGOL 68 stuff would probably have been a lot clearer.  When problems are current, we see them as needing to be fixed, so we fix them!  When the problem has been fixed for a while, it's easy to forget that there was ever a problem in the first place and to fall back into making the same mistakes all over again.  Incidentally, this phenomenon also serves to explain why Republican presidents are sometimes elected.

Wednesday, September 5, 2012

Sundries

The sheer volume of different programming languages being used today should be enough to intimidate many novice programmers (such as myself), especially considering that, in practice, one often finds several languages used in tandem to accomplish separate parts of a larger task.  This makes me think that the programmers arguing so vociferously for a particular language probably aren't just defending their language and attacking others because they can.  We can put it down to survival instinct.  After all, if someone's comfortable in one language and really dreads the thought of learning a new one -- or if, like the guy in the Duck Punching video, they have to stop using a language they're familiar with because their manager thinks Ruby is the way of the future -- it makes some sense that they would lash out.  None of us wants extra work.  And none of us wants to feel like our livelihood is threatened by changing times.

Anyway, I'd been thinking about that.  Goes with the thing from the last post about programmers not being immune to a sort of mythology.  We'll move on to this week's stuff now.

To start: I'm glad this exists.

As for the book-learnin', I'm having a hard time retaining the info from the light-speed overview of these 20ish different languages (and their offshoots) we've looked at in ch. 2.  I fully support Lauren's idea for creating a table to compare the languages, and I'll be more than happy to contribute to it myself.  But honestly, I don't know that this stuff is going to stick in my mind without firsthand experience of each language.  (And even then... we'll see.  There's a lot of stuff!)  The practice with BASIC is certainly helping, at least.

Ian makes a good point about many of the online Java vs. Python resources being one-sided.  I'd think that this could be partly attributed to Java often being used in the workplace for its reliability and huge-ness.  On account of its widespread use, Java doesn't really have to defend itself.  Pythonistas may feel that they have to fight for their language to get its time in the limelight, because it's just not as big a name at this point. Of course, it's also possible that the sorts of people who code in their free time (a group which likely contains most of the people who feel inclined to write blogs about programming) just prefer using Python.  Maybe for readability/writability reasons.  Maybe because they're writing relatively small programs, for which Python is just fine (and sometimes even better suited than Java).  Can't really say for sure why this phenomenon exists.

Except for the fact that Python is just better.

zing

Monday, September 3, 2012

Surveying Languages/The History Thereof

I've decided that I'm pretty psyched about looking into Lisp and Prolog, especially.  AI's pretty exciting, after all.  Don't think anyone's going to contest that.  (Even the people who are afraid of robots taking over the world are still, technically, excited about it, right?  Just not in the positive sense...)

It was surprising to see how similar some of the syntax rules of ALGOL and BASIC are to the ones we've already studied.  Very convenient for us!  Or maybe it's inconvenient...as they're all bound to start blending together pretty soon.

Anywho, I'm not going to go into the programming language history that we looked at in the textbook, as there are plenty of sources on the interweb that have dealt with the stuff much more expertly than I could. Here's a nice little graphical history that our professor shared with us, if you're interested. 

It's cute how I'm acting like people other than those in the class are reading this, isn't it?

As a young computer scientist about to be unceremoniously tossed out into the fighting pits... er, workplace, I don't find Mr. Brin's article bemoaning the lost age of students writing BASIC-level code to be especially encouraging.  I'm assuming that he considers experience with higher-level languages to be less helpful because you deal less directly with the inner workings of the computer itself, which makes sense...  It's good that our class is going to be looking at some BASIC this semester, then.  Personally, I'm very happy about this prospect.  Working with Python and Java so far, I've kind of felt like, "Oh, this is how it works because this is how it works," which doesn't feel like an altogether satisfactory explanation.

One last tidbit:
I want to draw attention to Lewis and Neumann's observation that "programmers, despite their professed appreciation of logical thought, are not immune to a kind of mythology."  This is a beautiful sentence, for one thing.  Whenever people invest themselves and their time in anything, they're bound to defend it against any would-be attackers.  We're programmed this way.  Let's just keep this in mind when Wednesday's debate rolls around.

Tuesday, August 28, 2012

teh first post

Welcome! 

Selecting the visual theme "Awesome Inc." for this blog seemed like a pretty obvious and practical decision.  (But actually! Your computer doesn't have to use as much energy to display dark backgrounds.  Hence: blackle!)

First, a short and heartfelt thank you to Ben Olmstead, Dante Alighieri, and the Italian language for helping to direct the course of the universe in such a way that the word "malbolge" has come to be the name of something nearly as horrific as the super-ebola-zombie-plague that the word calls to mind.  Apparently there's some dispute over whether it's entirely Turing-complete, but hey, at least we're on the right track.

Actual relevant stuff:
The first few sections of Sebesta's Concepts of Programming Languages have made me think that it might not be impossible for us humans to create an "ideal" programming language, the likes of which we discussed on Monday.  But, given that an ideal language would supposedly be ideally suited for every task, that the creator(s) would have to take into account problems where the immensity of such a flexible language could easily impede readability (beginning of 1.3.1.1 brings up this problem of large languages), and that new tasks for languages arise constantly as technology develops.... it could be a while.


In the meantime, here's a picture of a kitten.  (This is what blogs are for, right?!)