
Sometimes I feel like an antique: tcsh -> bash

In Software, Tools on November 12, 2011 at 5:34 pm

The other day I was doing something in the Linux shell and one of my students expressed surprise that I was using tcsh. Today, everyone seems to be using the more modern bash, the Bourne Again Shell. I got to thinking about why I use tcsh and realized that it’s just so comfortable. Things change so rapidly, and there’s so much to learn, that I tend not to relearn something when I don’t have to.

But then I thought, “am I missing out on something?” Sometimes I think it’s good to just learn a different way of doing things for the fun (or frustration) of it. So today I decided to become a little bit better at bash. Now, as with any of the shells, the man page for bash is pretty daunting. But all I wanted to do was set up an account on a clean Ubuntu Linux system that I’m preparing for use in a testing course next term. I’m creating a VirtualBox machine image so that all of the students will have the same environment for doing their work. If you haven’t tried VirtualBox, I highly recommend it as a choice for virtualization. I’ve used it hosted on Mac, Windows, and Linux and it works just fine.

So, after I created the VM for Ubuntu I wanted some of the things that are familiar to me in tcsh to be available in bash. Mainly I wanted a prompt that shows something like “gpollice [n]: “, where n is the number of the current command in the command history. I also wanted some simple shortcuts, such as aliasing ls to ls -CF and ll to ls -l.

In tcsh, these are simple aliases that look like this in my .tcshrc file:

alias ll ls -l
alias ls ls -CF

The conversion to bash is simple. bash assumes that you want to keep your aliases together: the default .bashrc file sources a file called .bash_aliases where you can put all of them. The two aliases above convert directly, with an equals sign between the alias name and its quoted contents. The aliases I have in .bash_aliases are:

alias g='grep -i'
alias ls='ls -CF'
alias ll='ls -l'
alias la='ls -A'
alias up='cd ..;pwd'
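For reference, the stock Ubuntu .bashrc pulls these in with a guard along these lines (paraphrased from the default file, so the exact wording may differ from any given release):

```shell
# Source ~/.bash_aliases if it exists, as the default Ubuntu .bashrc does
if [ -f ~/.bash_aliases ]; then
    . ~/.bash_aliases
fi
```

This means you never have to touch .bashrc itself to add an alias; dropping a line into .bash_aliases is enough.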

I was confident that the conversion would be easy. Next I wanted to configure the prompt, and finding the place in .bashrc was straightforward. But I wasn’t sure how I was going to get the history count. In tcsh, I do it this way:

set prompt = "`whoami` [\!]: "

In my .bashrc file I simply inserted the line:

PS1="`whoami` [\!]: "

It was just a matter of finding out what I had to change, and it was PS1 instead of prompt (note that bash, unlike tcsh, does not allow spaces around the equals sign in an assignment).
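Incidentally, bash has built-in prompt escapes, so the same prompt can be written without forking whoami at all; this is an alternative to the line above, not what my file originally used:

```shell
# Equivalent prompt using bash's own escapes:
#   \u = current user name, \! = history number of this command
PS1='\u [\!]: '
```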

Now for the hard part. There is a set of aliases that I’ve used since 1986 that I got from Bill McKeeman at the Wang Institute of Graduate Studies. It’s a neat way of bookmarking directories and getting back to them quickly. In tcsh, it’s a set of aliases that look like this:

alias listwork	'ls -al ~/.wrk* | fgrep '.wrk' | sed "s/^.*.\.wrk\.//"'
alias rmwork	'rm -f ~/.wrk."\!*"'
alias setwork	'rm -f ~/.wrk."\!*"; ln -s "`pwd -P`" ~/.wrk."\!*"'
alias work	'cd ~/.wrk."\!*";echo "pwd: `pwd -P`";cd `pwd -P`;ls;'

This simply creates soft links in my home directory, prefixed with “.wrk.”, that point to the bookmarked directories. It’s an elegant way to do quick navigation; I would be lost without it. Since everything had come along easily thus far, I was hopeful that I’d have my bash environment set up quickly. WRONG! I spent the afternoon tracking down how to actually get it done. In the process, I learned about shell functions in bash and found that they really are nice and simplify a lot of things. I suspect I’ll now go crazy and put a ton of these into my environment—but that’s for another time.

Without going into details, the solution to this was to add these lines to my .bash_aliases file.

setwork () { rm -f ~/.wrk.$1; ln -s "`pwd -P`" ~/.wrk.$1; }
listwork () { ls -al ~/.wrk* | fgrep '.wrk' | sed "s/^.*.\.wrk\.//"; }
rmwork () { rm -f ~/.wrk.$1; }
work () { cd ~/.wrk.$1; cd `pwd -P`; echo "pwd: `pwd -P`"; ls; }
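As a quick sanity check, the functions can be exercised in a throwaway environment. This is purely illustrative: it repeats the definitions so the snippet stands alone, and it points HOME at a temp directory so no real dotfiles are touched.

```shell
# Sandbox: redirect HOME so the .wrk.* links land in a temp directory
HOME=$(mktemp -d)

setwork () { rm -f ~/.wrk.$1; ln -s "`pwd -P`" ~/.wrk.$1; }
rmwork  () { rm -f ~/.wrk.$1; }
work    () { cd ~/.wrk.$1; cd `pwd -P`; echo "pwd: `pwd -P`"; ls; }

mkdir -p "$HOME/project" && cd "$HOME/project"
setwork demo          # bookmark the current directory as "demo"
cd /tmp               # wander off somewhere else
work demo             # jump back through the ~/.wrk.demo symlink
```

Note that work does a second cd through `pwd -P` so that you end up at the physical path, not sitting inside the symlink itself.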

The thing to remember is that when you need to refer to arguments within a command, you can’t use an alias; bash aliases don’t take parameters. You need to use a function. But functions are not hard to learn. Once I found the right information it took just a few minutes.
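A tiny illustration of the difference (mkcd is a made-up example here, not part of my setup): an alias can only prepend text to a command line, while a function can place its argument wherever it’s needed:

```shell
# alias: pure text substitution; arguments can only trail the expansion
alias mk='mkdir -p'

# function: positional parameters ($1, $2, ...) can go anywhere in the body
mkcd () { mkdir -p "$1" && cd "$1"; }
```

With the alias, mk some/dir works only because the argument happens to belong at the end; something like mkcd, which uses its argument twice, has to be a function.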

I might become a bash user after all. At least this old duck can still learn new tricks.


Agile and the second chasm: history does repeat itself

In Computer science, Software Engineering on November 8, 2011 at 5:05 pm

Kent Beck posted an interesting essay on the Agile Focus blog back in February. I think it’s a really insightful and important statement from one of the Agile elders (sorry Kent, but accept the fact that we’re getting older). Although it appeared eight months ago, I just read it thanks to a re-tweet of the link by one of the people I follow. A downside of the Web and the age of instant communication is the exponential explosion of content, much—maybe most—of which is simply noise.

I’m glad that I came across this post though. It made me think hard about what he’s saying. I have found that over the years, I agree with Kent much more than I disagree with him. I remember being on a panel at a conference back around 2002 on XP, Agility, and process in general, and it seemed that Kent and I were on the same side, even though I was there officially representing Rational and the Rational Unified Process team.

The other panelists, all Agile consultants, were promoting dogmatic adherence to all of the XP practices. They indicated that they forced all team members to do all of the practices all of the time when they were engaged to help teams use XP. Well, if you look at XP, that’s what you’re supposed to do. But, that doesn’t always work. You need to use some common sense.

Kent and I were suggesting that the specific set of practices, and how dogmatically they are applied, should be appropriate to the team and its situation, not just a blind application from a book or some other source. In short, it would be better for the team to succeed by adopting a handful of practices and using them in a way that suited the team than it would be for them to turn all the XP practice dials up to 10 and fail.

During the Q&A period, Mary Poppendieck made the comment that except for Kent and me, the panelists sounded like the process police. I found this quite funny. A significant part of the Agile movement, besides trying to discover better ways of delivering software, was to combat RUP and other process products that were deemed too restrictive and dogmatic (they also had a significant portion of the market). Getting into the Agile camp allowed consultants who worked individually or in fairly small companies to gain traction in places where they might not otherwise have been able to compete. Certainly large companies were reluctant to get away from the IBM Rational security blanket. They might not always succeed, but they followed the conventional wisdom of “you can’t go wrong with IBM.”

So, here we are more than ten years after the Agile Manifesto was signed. Are things different? Well, yes and no. Certainly we have added many new tools to our software engineering toolbox. We look at software development differently in certain areas. We tend not to inflict heavyweight methods when lighter ones will do. In fact, we are probably guilty of erring toward the opposite end of the spectrum now and not formalizing things when we probably should. After all, we don’t want people to think that we’re not Agile!

One might argue that many of the changes we’ve seen would have occurred naturally, without the Agile movement. But, however the changes have come about, I think we have a better set of tools—intellectual and otherwise—to use in our craft. What I don’t think has changed to a large extent is the ability for software developers and managers to use common sense when deciding how to work. They still want someone to tell them what to do and how to do it.

At the end of the 1990s, the RUP was sold as a process framework that you could customize to your team, the type of project you had, and the overall organization. Customers who did so were usually quite successful. Those who thought they had to do everything in the several thousand pages of advice in the RUP had some spectacular failures. Of course Rational, and later IBM, was happy to send consultants to your company to help you figure out the right configuration. That is, someone would tell you what to do, how, and when.

Today, Agile consultants come and help you adopt Scrum (certified Scrum Masters), or XP, or Lean, or Kanban, or … . It doesn’t matter what. We don’t want to think and reason about what’s best for us and our project teams. We want someone to tell us. Perhaps we think it will allow us to avoid taking responsibility for the outcome of our projects; never mind that a cornerstone of Agility, or any reasonable methodology, is reflection by the team and adjusting the process. That seems like too much (non-productive) work. We can hire someone to tell us what to think and how to do our jobs. After all, they’re the experts, the consultants who have the experience.

So, we’re right back where we were ten or twenty years ago. We let people tell us what’s good for us. We don’t think. We let others do that for us. We’re always riding the crest of the next wave, be it XP, Scrum, Lean, or whatever. If we’re current, we must be doing the right thing. That may be true, but only if the current thing actually applies to what you’re doing. In 1999, Alistair Cockburn talked about Methodology per Project. I think this essay went unnoticed at the time. That’s a pity; there are some real nuggets in what he talks about. Of course he’s a consultant trying to convince you that his Crystal family of processes is right for you. Simply put, every project and project team is different. We don’t manufacture software, because each program is different. If we write the exact same code multiple times then we’re idiots. Ours is a craft. We produce one-of-a-kind products. If our products are one-of-a-kind, then maybe the way we produce them should be as well. That’s the key.

If we don’t start smartening up, we’re going to be in the same place ten years from now. Oh, there will be new methods and tools, but we won’t really know how to use them. We’ll hire someone to tell us what to use and how to use it. Same old, same old. Unless we start thinking for ourselves, reflecting upon our experiences, and taking our destiny into our own hands, we’re doomed as a profession. As the world’s appetite for software increases, we’ll sit back and hope someone can show us how to do better.

What I hope students get out of my software engineering course is the ability to evaluate tools and methods and pick the right ones for their projects without someone else telling them what to do.