Archive for the ‘Software’ Category

Is the Apple core starting to rot?

In Software, Technology on March 17, 2012 at 7:18 pm

After restarting my MacBook Pro for the third time today because the screen didn’t turn on when I opened the cover after shutting it for a while, I’m beginning to wonder whether Apple has jumped the shark, so to speak. This isn’t the only “wart” that has shown up since I installed OS X Lion last week during term break. There was the time I shut the cover and the processor didn’t go to sleep. By the time I got home, the fan was humming and the processor temperature was up around 180°F.

Installing Lion was simple. Download it from the App Store and in about an hour the system was up and running. Now, Apple has always been the epitome of systems that are intuitive and just do what you expect of them. But some things just don’t feel quite right or work as well as one would expect. None of these are big things, but the little things that start to go wrong can often be harbingers of bigger things to come.

First, when I got to school I had to re-approve the use of the wireless certificate. That’s the only certificate that I have had this happen with. I’m not sure why.

Next, I had to set up my desktop background photo selection and the rate of change several times before it “took.” When I say that it “took” I mean that some of the desktops that I use would suddenly show a background from the Apple set of images rather than the Aperture album that I had selected. And, no matter what I did short of deleting the desktop, I couldn’t get it to change back until I rebooted the system.

One of my colleagues at work has a MacBook that he’s starting to use for some development. He has Lion installed on it. We had to install Xcode so he could get the C compiler needed to install Ruby. This used to be pretty simple. Now, installing Xcode doesn’t seem to be sufficient. You also have to go through a few gyrations to actually install the command line tools. No big deal, but once again annoying.
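For what it’s worth, a quick way to confirm the tools actually landed is to check for the compiler, since Ruby’s native-extension gems won’t build without one. A minimal sketch:

```shell
# Check whether a C compiler is on the PATH; if not, the Xcode
# command line tools still need to be installed.
if command -v gcc >/dev/null 2>&1; then
  echo "compiler found"
else
  echo "command line tools missing"
fi
```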

The system just doesn’t feel quite right. There are just these small things that make me wonder whether the attention to detail has lessened now that Steve Jobs isn’t there to browbeat the company into submission over the details. One final example is the look of the Finder sidebar. There used to be a little bit of color applied to the special folders, like Applications and Downloads. Now, the sidebar is color-free. Searching the Web turned up several articles on how to restore the color using the SIMBL plugin. One problem is that you have to kill the Finder and restart it every time you want the color icons in the sidebar.

There may be over 200 new features in Lion, but in my opinion they took a step backwards on some. During the couple of reasonably unhappy years I spent at Digital Equipment Corporation, I remember Ken Olsen, the CEO, telling us that we’d tell the customers what they wanted. He was wrong. DEC told them, but it wasn’t what they wanted, nor what they needed. DEC had been wildly successful, but started to believe its own press and ended up not giving customers what they wanted or needed, and the world overtook them.

Now, I’m not saying that this is what’s happening with Apple. Certainly Steve Jobs told the customers what they wanted. Usually, he was right. He definitely understood the customers Apple was going after. Developers, like me, are not the main customer base for Apple. But they have done a good job of giving us developers a good set of tools, especially now that they have a Unix-based OS. Jobs had a vision and he carried it out. The music world, mobile platforms, and many other things changed because he said that it was what people would want—and he was right. I hope Apple can keep doing this.

But, with the Lion experience that I’m having, I just have this nagging feeling that there’s a little bit of rot creeping into the apple. I hope they can stop it from spreading.


ANTLR and Ruby—Perfect Together

In Ruby, Tools on January 30, 2012 at 9:33 am

I’ve been playing around with Ruby quite a bit lately. It’s a good language for building tools for my testing course. It’s also a great metaprogramming language. This weekend I found that it’s a pretty good language for building code analyzers and other language tools as well, especially if you have a great language processing tool like ANTLR (ANother Tool for Language Recognition).

I wanted to write a tool that would read in source code for a function (in a simple language that I created) and produce a list of nodes with information about definitions and uses of variables in a textual form that my students would be able to input into their programs for further analysis of control flow and data flow.

I use ANTLR as the tool for building lexers and parsers in my compiler construction course. But I’ve always used it by writing Java and having it generate Java. Certainly my tool could be written in any language, since it just had to produce textual output, so writing the ANTLR-based processor in Java was my first thought. Then I decided it would be more fun to learn something new, and I thought about producing the tool in Ruby.

I knew that ANTLR has an option to produce Ruby and thought I’d take a crack at that. With a little bit of Web searching I found the ANTLR for Ruby project. All I had to do was load a Ruby gem and I was off and running. Well, at least I was off and crawling along.
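The gem step really is a one-liner (assuming RubyGems is already set up; antlr3 is the name the ANTLR for Ruby project publishes under):

```shell
# Install the ANTLR for Ruby runtime that the generated lexer and
# parser depend on at run time.
gem install antlr3
```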

I spent most of Saturday looking at documentation and understanding what an ANTLR grammar with Ruby actions would look like. I also wanted Ruby because I had the feeling that the plumbing around the resulting lexer and parser would be much simpler to write than it is in Java—and I was right!

I set up my project, initialized a Git repository, and started writing tests for the lexer and parser. This was slow going at first. I was learning and trying to write good code. Most of all though, it was fun.

The documentation on ANTLR for Ruby is pretty minimal. It seems that Kyle Yetter, the author, moved on to other things, and a lot of details are missing from the Web page. However, there is more detail, along with the complete antlr3 API, on the actual project pages. It’s not the best documentation, but it’s enough to get you going. Once I muddled through the basics, I picked up steam and finished the application comfortably on Sunday.

I found that the Ruby grammar file was simple and easy to read. Connecting to the parser and lexer, and setting up my own internal structures for capturing nodes, definitions, and uses of variables, was almost trivial.

By the end of the weekend I had written the following number of lines (just using wc, so blank lines and comments are included here):

  • The ANTLR grammar file: 239
  • The script to run the processor and produce the output: 36
  • Utility classes for the nodes, def-use entries, and exceptions: 60
  • Unit tests: 285
  • Rake files: 31

So, the total number of lines I actually wrote came to 651. ANTLR generated 3059 lines of code. I think this is pretty good output for a weekend’s hackfest. I’ll post some of the code and the grammar for the language in future posts.
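Since the counts came straight from wc -l, blank lines and comments inflate them a bit; a tiny illustration:

```shell
# wc -l counts every line, blanks and comments included.
printf 'def foo\nend\n\n# comment\n' > /tmp/demo.rb
wc -l < /tmp/demo.rb    # counts 4 lines
```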

I encourage you to try your hand at something like this. It’s fun and you get to learn a lot. One thing I will say is that if you do, make sure you have some really good documentation available on the tools you’re going to use. During the weekend I had both Programming Ruby 1.9 and The Definitive ANTLR Reference open in my PDF reader. I didn’t need the ANTLR book that much; once you’ve written an ANTLR grammar, it’s pretty straightforward. But I was constantly searching the Ruby book for the right classes and methods. These two, combined with the online Ruby documentation, made the programming adventure fun and productive.

A Case for Virtualization

In Computer science, OpenSource, Software Engineering, Tools on December 7, 2011 at 9:56 am

Maybe I’m late to the party, but until recently I had not paid much attention to using virtual machines on a regular basis. I figured that since I had a couple of desktop Windows systems and a couple of Mac notebooks, I’d be able to use almost any software I wanted without any problems. Well, that’s probably true, but when you begin to use lots of different software tools, especially open source tools that have many dependencies, things begin to get really sticky. And, when you try to figure out how to put together the software needed for students in your courses…well, let’s just say that if I had any hair left on my head, it would be gone. I am convinced that dependencies are the hobgoblins of open source.

If you’ve read some of my more recent posts, you know I’ve been playing with Cucumber for use in my upcoming Testing for Developers course. I want the students to be exposed to lots of tools, and that means we need lots of different languages, especially some of the dynamic languages, on the system. While Microsoft makes developing software for Windows relatively easy, it’s not my favorite platform for doing anything outside of the Microsoft ecosystem. I understand that and I’m not complaining. In the words of Bill Belichick, “it is what it is.” So, I do almost all of my development on my MacBook. It’s fast and it’s pretty much Unix (the key words here are pretty much).

If you stick to just Java, C, or C++, it turns out that almost any platform works about as well as another. I mean, if you load MinGW onto a Windows system you can develop with the C family. Add Eclipse and you’ve got a pretty portable IDE. But when you start to look beyond these languages, things get really murky.

A Concrete Example

Let’s look at a real example of what I’m getting at, setting up Cucumber for the class. We’ll start by just considering my Mac setup. Cucumber is written in Ruby and works really well if you have simple scenarios that you use to exercise some simple Ruby code. By simple Ruby code I mean code that has no GUI, database, Internet, or other features. In other words, code that’s probably not very useful for a real application.

As I went through the examples in The Cucumber Book, I began to need additional components like Sinatra, Capybara, and many others that I have no experience with. Well, how do you get all of this to work? Do they all really work together seamlessly? Maybe, if you have the right versions of each, along with the right versions of Ruby and the other gems.

It turns out that I had four versions of Ruby on my system: ruby 1.8.7, 1.9.1, 1.9.2, and MacRuby. What’s worse is that I didn’t realize it. Some of them were installed in a standard location (whatever that means). If I reordered the sequence of directories in my PATH, I got a different version of Ruby. I needed some consistency. I was using Ruby 1.9.1 and things were going along fairly well when I needed to get the service_manager gem. I installed it using the standard “gem install” approach. Then, when I tried to use it, there was a problem with readline.

After several hours of searching the Web, tweeting to a few friends, and posting on Stack Overflow and The Cucumber Book forum, I got an answer that let me get past this. But things had deteriorated to the point that I had no idea what was installed, or where. I had used bundler for some things, gem for others, downloaded packages, built some things from source, and used other ways of getting software onto my system. I also tried rvm, but frankly by this time I wasn’t sure what I was doing and the documentation wasn’t very helpful. Do you need to be using bash? How do you get it working with other shells? I was sitting on a software stack that was about to come crashing down. I could feel it in my bones.


I decided to remove everything Ruby on my system and start over. Okay, that was another few hours, making sure that I copied everything I removed just in case I really messed it up. Finally, I was ready to start over. Thanks to Matt Wynne, one of the authors of The Cucumber Book, I ended up using rvm and the Gemfile from the latest version of the installation appendix in the book, and I got the right version of Ruby installed along with the gems and packages I needed. Now, this wasn’t perfect, but it was manageable. Some of the documentation was a little bit off and I had to get on the rvm IRC channel to ask for help, but I did it.
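The shape of the setup I ended up with looks roughly like this; the installer URL and version numbers are a sketch from memory, not the exact commands from the book:

```shell
# Install rvm, pin a single Ruby, and let bundler resolve the gems
# from the book's Gemfile (versions here are illustrative).
curl -sSL https://get.rvm.io | bash -s stable
source "$HOME/.rvm/scripts/rvm"   # load rvm into the current shell
rvm install 1.9.2
rvm use 1.9.2 --default
bundle install                    # reads the Gemfile in the project
```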

Enter the Virtual Machine

Well, this was great. I now had a consistent installation on my Mac. But most of my students don’t use Macs. Some use Windows, some use Linux. Since they’re computer science students—advanced undergraduates and graduate students—I’m safe assuming that they know their way around Linux. If I could get the software configured on a Linux virtual machine, I could just give the students the VM image. Then I could either add new software for other topics I expect to cover in the course or simply create a different VM image.

I’ve been using VirtualBox for some time now to try out different versions of Linux, run Windows on my Mac, and so on. It’s free, runs on any x86 architecture, and just seems to work pretty well. So, that was the plan. I created the VM, loaded the latest Ubuntu Linux, and made sure that it had some basic software like Java and a few other components. Then I went through my checklist of how to get the right Ruby and Cucumber installed, and in less than half an hour I had it running. I took a snapshot so that if I try to extend the image and mess it up, I can roll back. Voilà, the students can work without having to lose sleep over getting their systems set up. There are other things they’ll lose sleep over that are much more important.
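Snapshots can be taken from the GUI, but they can also be scripted with VBoxManage, which ships with VirtualBox; the VM and snapshot names here are made up:

```shell
# Take a named snapshot of the course VM, and roll back to it later
# if an experiment goes wrong ("testing-vm" is a hypothetical name).
VBoxManage snapshot "testing-vm" take "baseline" --description "Ruby and Cucumber installed"
VBoxManage snapshot "testing-vm" restore "baseline"
```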

The VM approach, and the ability to snapshot and branch off of different snapshots makes life so much easier for me as a teacher and it will save the students a tremendous amount of time. I’m sure some of the uber-geeks in the class will set up the software on their systems because that’s in their DNA. However, for those who really want to concentrate on the course topics, they can now do that without the hassle of being a system administrator. I will definitely use this approach more in future courses.

Sometimes I feel like an antique: tcsh -> bash

In Software, Tools on November 12, 2011 at 5:34 pm

The other day I was doing something in the Linux shell and one of my students expressed surprise that I was using tcsh. Today, everyone seems to be using the more modern bash, or Bourne Again Shell. I got to thinking about why I use tcsh and realized that it’s just so comfortable. As things change so rapidly and there’s so much to learn, I tend not to relearn something when I don’t have to.

But then I thought, “am I missing out on something?” Sometimes I think it’s good to just learn a different way of doing things for the fun (or frustration) of it. So today I decided to become a little bit better at bash. Now, like any of the shells, the man page for bash is pretty daunting. But all I wanted to do was set up an account on a clean Ubuntu Linux system that I’m preparing for use in a testing course next term. I’m creating a VirtualBox machine image so that all of the students will have the same environment for doing their work. If you haven’t tried VirtualBox, I highly recommend it as a choice for virtualization. I’ve used it hosted on Mac, Windows, and Linux and it works just fine.

So, after I created the VM for Ubuntu I wanted to have some of the things that are familiar to me from tcsh available in bash. Mainly I wanted a prompt that shows something like “gpollice [n]: “, where n is simply the current command number in the command history. I also wanted some simple shortcuts that I alias, such as ls for ls -CF and ll for ls -l.

In tcsh, these are simple aliases that look like this in my .tcshrc file:

alias ll ls -l
alias ls ls -CF

The conversion to bash is simple. bash encourages you to keep your aliases together: the default .bashrc sources a file called .bash_aliases where you can put all of them. The two aliases above convert directly, with the addition of an equal sign (and quotes) between the alias name and its contents. The aliases I have in .bash_aliases are:

alias g='grep -i'
alias ls='ls -CF'
alias ll='ls -l'
alias la='ls -A'
alias up='cd ..;pwd'
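One wrinkle worth knowing: bash only expands aliases in interactive shells, so a script that wants to use them has to opt in with shopt. A minimal check:

```shell
# Alias expansion is off in non-interactive shells by default;
# shopt -s expand_aliases turns it on for scripts.
shopt -s expand_aliases
alias ll='ls -l'
type ll    # reports that ll is aliased to `ls -l'
```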

I was pretty confident that the conversion would be easy. Next I wanted to get the prompt configured, and finding the place in .bashrc was simple. But I wasn’t sure how I was going to get the history count. In tcsh, I do it this way:

set prompt = "`whoami` [\!]: "

In my .bashrc file I simply inserted the line:

PS1="`whoami` [\!]: "

It was just a matter of finding out what I had to change: the variable is PS1 instead of prompt, and bash doesn’t allow spaces around the equal sign in an assignment.
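bash also has a built-in escape for the user name, so the whoami backticks can be dropped entirely; an equivalent sketch:

```shell
# \u expands to the user name and \! to the history number when
# bash displays the prompt, so no whoami subshell is needed.
PS1="\u [\!]: "
```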

Now for the hard part. There is a set of aliases that I’ve used since 1986 that I got from Bill McKeeman at the Wang Institute of Graduate Studies. It’s a neat way of bookmarking directories and getting back to them quickly. In tcsh, it’s a set of aliases that look like this:

alias listwork	'ls -al ~/.wrk* | fgrep '.wrk' | sed "s/^.*.\.wrk\.//"'
alias rmwork	'rm -f ~/.wrk."\!*"'
alias setwork	'rm -f ~/.wrk."\!*"; ln -s "`pwd -P`" ~/.wrk."\!*"'
alias work	'cd ~/.wrk."\!*";echo "pwd: `pwd -P`";cd `pwd -P`;ls;'

This simply creates, in my home directory, soft links prefixed with “.wrk.” that point to the directories I want to bookmark. It’s an elegant way to do quick navigation. I would be lost without this. Since everything had come along pretty easily thus far, I was hopeful that I’d have my bash environment set up pretty quickly. WRONG! I spent the afternoon tracking down how to actually get it done. In the process, I learned about shell functions in bash and found that they really are nice and simplify a lot of things. I suspect I’ll now go crazy and put a ton of these into my environment—but that’s for another time.

Without going into details, the solution to this was to add these lines to my .bash_aliases file.

setwork () { rm -f ~/.wrk.$1; ln -s "`pwd -P`" ~/.wrk.$1; }
listwork () { ls -al ~/.wrk* | fgrep '.wrk' | sed "s/^.*.\.wrk\.//"; }
rmwork () { rm -f ~/.wrk.$1; }
work () { cd ~/.wrk.$1; cd `pwd -P`; echo "pwd: `pwd -P`"; ls; }

The thing to remember is that when you want to pass arguments, you can’t use a bash alias; you need a function. But functions are not hard to learn. Once I found the right information it took just a few minutes.
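The difference in miniature (greet is a made-up example): an alias is plain text substitution, while a function receives positional parameters it can place anywhere.

```shell
# "$1" is the function's first argument; an alias has no way to
# drop an argument into the middle of a command like this.
greet () { echo "Hello, $1!"; }
greet world    # prints: Hello, world!
```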

I might become a bash user after all. At least this old duck can still learn new tricks.