Friday, January 20, 2006

Big NAND Flash

It brings a smile to my face to read about someone who gets a brand-new 8GB thumb drive and whose first instinct is to open it up to see what's inside. Click on the title above to read more.

Friday, January 13, 2006

Computer security: still sucky after all these years

The title above links to an interesting article at SearchSecurity.com, indicating that, despite all the attention given to computer security these days, attacks continue unabated. It's not clear from the article whether they're distinguishing between attempted attacks and successful compromises; for instance, the article mentions port scans, which are not indicative of security holes in and of themselves.

It's really astounding to me that this is still such a problem in corporate settings. Has anyone thought of demanding secure software? Have any major customers gone to Microsoft and threatened either legal action or a wholesale move to Linux if Windows isn't secured within some reasonable time schedule? Meanwhile, developers continue to add "features" intended to improve the "user experience" that also improve the cracker experience.

Then there are the basic security measures any company can take, like not allowing users to install software (separate administrator accounts) and disabling features that allow software not installed on the computer to run (CD autorun, various scripts and macros embedded in emails or files, etc.). And, in addition to the article's suggestion of thorough background checks on employees, might I suggest treating employees well, so they like the company and won't want to do bad things to it?

And here's the FBI's attitude towards computer crime:

Computer related crime is the third-highest priority in the FBI, above public corruption, civil rights, organized crime, white collar crime, major theft and violent crime.
Does anyone else feel that this is a bit out of whack?

The nonsensical idea of removing bias from academia

I got an email today from David Horowitz with a link to the posting on his blog linked from the title above. The email was in response to my previous article on news from Inside Higher Ed about hearings into the so-called "Academic Bill of Rights" in Pennsylvania; his posting was in response to the Inside Higher Ed article itself. In it, he says that claims of his retracting evidence supporting his position have been exaggerated. He also states that he was only questioned for 2 minutes "of the eight hours or so of testimony", which makes his role seem minor indeed. However, later in the article he mentions in passing that he gave an hour-long presentation; in other words, he took up 1/8 of the entire testimony time. Not minor at all. He says nothing about the quotes in which he disparaged the need for factual information, suggesting that fictional accounts are just as useful as facts in support of his position.

And what is his position? In Horowitz's own words:

...my agenda with the Academic Bill of Rights is not to attack "leftwing bias" as my critics claim, but to take politics out of the classroom whether the politics comes from the left or the right.
In this article, I would like to make the argument that the idea of removing bias from academia is fundamentally misguided and indeed nonsensical.

Let me argue by way of analogy with the state of affairs in my own field, computer science. Consider the representation of different operating systems in computers in use in CS departments versus the "outside world". Outside CS departments, somewhere around 90% of computers run Microsoft Windows. I don't know what the percentage is in CS, but it's almost certainly much less (and, even if it weren't, for purposes of this analogy, it won't hurt to assume that it is). Let's assume that this is evidence of "OS bias" in academia, and suppose Microsoft or its supporters were to protest, demanding that students used to running Windows at home not be indoctrinated into other OSes, and furthermore that action be taken to remedy this bias, perhaps by monitoring CS departments' purchasing decisions. This is equivalent to the idea of eliminating political bias from academia (many CS folks actually term this a "religious," rather than "political," issue). I would argue that there are two ways of looking at this "OS bias":
  1. The bias arises for technical reasons. In other words, the academic CS environment is fundamentally different from the IT environment outside academia, and thus requires a different mix of OSes. Under this explanation, Windows is less appropriate for dominant use within CS departments by dint of certain of its features and thus is used less than it is outside. This is equivalent to saying in the political arena something like, "the anti-intellectual nature of modern American conservatism is not as compatible with academic pursuits as with non-academic ones, naturally leading to fewer academics being conservatives." (And I would hypothesize that, if you compared the positions on specific issues among self-identified conservatives in and out of academia, you'd find that the academics have significantly different views than the non-academics.)
  2. The bias arises for social reasons. This argument says that CS faculty are, in general, more knowledgeable about technical matters than people outside of CS departments, and as a result choose Windows less frequently. It's not that the CS environment is different; it's that those with less expertise choose their OS for non-technical reasons (reasons not connected with the characteristics of the OS itself). When choosing among competing OSes, CS faculty make better-informed, more thoughtful choices, leading to a different mix. This is equivalent to the argument that political bias in academia reflects better education, greater expertise, and more thought: Political Science, History, etc. faculty are making better political choices than people who have less knowledge and less time to think about the issues.
So, which is it? Are there differences in the political mix in and out of academia because conservatism has made itself incompatible with academia? Or is it because, given time to think about issues, education in historical and other contexts, and training in critical thinking, academics by and large see that the emperor has no clothes?

Wednesday, January 11, 2006

Facts? We don't need no stinking facts!

The title above links to an article at Inside Higher Ed that covers recent events at hearings in Pennsylvania relating to the so-called "Academic Bill of Rights". The basic news is that the major proponent of these laws across the country, David Horowitz, was caught using fabricated stories to support his contention that students are penalized at universities for espousing views contrary to their professors' supposedly liberal biases. Here's Mr. Horowitz:

Everybody who is familiar with universities knows that there is a widespread practice of professors venting about foreign policy even when their classes aren't about foreign policy... the lack of evidence on Penn State doesn't mean there isn't a problem... These are nit picking, irrelevant attacks.
In other words, he doesn't need evidence; his assertion of liberal bias is, in and of itself, justification enough for writing laws to counter that bias.

Is it any wonder that these laws are, by and large, Orwellian intrusions of politics into education? As for the supposed liberal bias of academia, here's a nice quote from the comments to the article:

And someone really needs to say this: Contemporary American conservatism has come to devalue not only evidence, but knowledge and the [search] for knowledge itself so completely that it is nearly impossible to obtain a graduate degree without learning to think far more clearly and rigorously than contemporary American conservatism will permit. By default, such thinking makes one a 'liberal,' and the statement, 'The campuses are full of liberal professors' is essentially equivalent to 'The campuses are full of professors who have studied, continue to study, and value knowledge.'
As I and others have said before: if academics tend to be liberals, it's because modern conservatism has embraced a mindset that in many ways is the polar opposite of learning, critical thinking, and discovery of new knowledge.

Monday, January 09, 2006

There's an engineer shortage! No there's not! Yes there is!

The title above links to what appears to be an op-ed piece in The Philadelphia Inquirer written by the Dean of the Duke University Engineering school, Kristina Johnson. It's in response to recent news articles that purport to show that there's an engineer gap between the US and India and China -- that those other countries are graduating many more engineers than we do. Prof. Johnson is touting a study done by her school that argues against this. In a nutshell, it shows that, if type of degree and field of study are taken into account, the US graduates about as many engineers as those other countries. Daniel Drezner has commented on the Duke study already.

As can be expected, everyone uses the latest information in an attempt to support his point of view. Prof. Drezner concludes his comments, saying:
So, to conclude, offshore outsourcing will take place when the tasks can be segmented into discrete, simple and rote tasks, and does not pose a threat to engineers at the B.S. level or above.
This is based on surveys of how "capable of competing in the global outsourcing environment" Chinese and Indian engineers are. Right now. Perhaps not the best information to bet a career on, even assuming it were possible for a labor market study to accurately assess such competitiveness. Of course, if you're a big fan of outsourcing, as Prof. Drezner is, then you'll likely grasp at anything that obscures the fundamental illogic of the claim that exporting high-paying jobs overseas is actually good for American workers (as opposed to American corporations).

Dean Johnson, on the other hand, starts with her school's study, which says that things are fine as far as current graduates go, and goes on to assert that things will get worse in the future:

If we don't act now and invest in engineering education, we will certainly lose the innovation edge we have enjoyed in this country. But I'd go further. I'd argue that it is our responsibility as good citizens of the planet to educate more engineers to help shoulder these coming challenges.

Is the sky falling? No, not yet. But if our most talented domestic students don't go into engineering, the rest of us will have to prop it up somehow.

I'm all in favor of investing in education, especially in math and science. This education is becoming essential for success in today's society and for full participation (or, at least, competent participation) in democracy. One example of this comes from survey results (PDF) entitled "How Americans view personal wealth vs. how financial planners view this wealth", from the Consumer Federation of America. Part of the survey says:
The surveys also found that more than one-fifth of Americans (21%) -- 38% of those with incomes below $25,000 -- think that winning the lottery represents the most practical way for them to accumulate several hundred thousand dollars.

But, I don't think that improving K-12 science and math education is going to "fix" any problem in engineering, because the problem isn't in the number of qualified students. As I've written before, there are plenty of good, qualified students; they're just not choosing engineering careers.

Saturday, January 07, 2006

What's the journal equivalent of a spamference?

Is it a "spamnal"? I just received the following email invitation:

Dear Professor,

Editors/Associate Editors are required in the fields
of computer science for the journal “Antarctica
Journal of Mathematics”. Are you willing to join as an
editor/associate editor for the journal “ANTARCTICA
JOURNAL OF MATHEMATICS”? 
At first, this seems like a flattering invitation. But why the Antarctica journal of math? Are there many mathematicians in Antarctica? Or does this journal focus on math issues of relevance to Antarctica (perhaps global climate change)? Then I looked at the "To:" and "Cc:" lines of the email. There were addresses for folks at various universities, but my address was nowhere to be found. This, of course, is one of the hallmarks of spam.

A quick check on Google revealed a brief posting at the Annals of Improbable Research blog and a brief LiveJournal discussion suggesting that maybe the use of Antarctica was a marketing ploy. There's also the journal's web site, at a free web hosting service, which I'll not link to. There was one issue of the journal in 2004 and two in 2005. Most of the publications are from India, and in fact a large percentage are from people associated with the journal. The overall site design follows the "use lots of colors" philosophy. I think I'll pass on this opportunity. But if Science or Nature needs editors, I'd be happy to entertain an invitation.

More on software development laziness

In my previous posting, I commented on the trend of developers using customer machine cycles in an attempt to absolve themselves from the need to produce quality code. If you follow the link from the title above, you'll see a sidebar written by Eugene Spafford entitled, "Are the bad guys winning?" Prof. Spafford comments on the current lack of attention given to security in software development. To me, this is just another example of the same sort of reasoning: the only thing that counts in software development is the bottom line of the next upgrade cycle.

Now, while I might object to the idea of software developers burning my computer's cycles to compensate for fundamental laxities in their development processes (for example, by using managed languages, so their programmers don't need to worry about freeing up allocated memory themselves), I at least recognize that this is a feasible approach. In other words, developers really can compensate for poor processes this way.
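
To make the trade-off concrete, here's a minimal sketch (my own illustration, not anything from the article) of the manual bookkeeping C demands, which managed languages replace with a garbage collector spending the user's cycles instead:

    /* Manual memory management in C: every malloc() must be paired
       with a free(), or the program leaks. Managed languages trade
       this discipline for a garbage collector that burns the user's
       cycles finding unreachable memory automatically. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        char *buf = malloc(64);      /* the programmer asks for memory */
        if (buf == NULL)
            return 1;                /* and must handle failure        */

        strcpy(buf, "hello, cycles");
        printf("%s\n", buf);

        free(buf);                   /* and must give it back          */
        buf = NULL;                  /* guard against dangling use     */
        return 0;
    }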

Such is not the case for security. Burning my cycles does nothing to make computer systems more secure. The only thing that will, as Prof. Spafford writes, is hiring people with real security knowledge and skill and paying attention to development processes that foster security. I would argue that these are the same steps that would foster more efficient, higher-quality code. Which of the following three potential futures (from Spafford) do you think will happen:

In the first, the market realizes the cost of tacking security onto systems as an afterthought, and demands and compensates vendors for simpler, more secure systems... The second outcome is that we limit our use of information technology to avoid security-related problems. The third outcome is that we continue on our merry way until the system implodes.

Friday, January 06, 2006

Give me back my cycles!

One of the things you do when teaching object-oriented programming (especially when you learned to program in the structured programming era) is spend time thinking about how programming paradigms come and go and how much of the impetus to change has been restricting what programmers can easily do so that they have less leeway to get into trouble (hopefully meaning fewer bugs). Then my colleague Daniel Lemire goes and questions the utility of debuggers, and I wonder if all of these linguistic and developmental tools merely serve to make software developers lazy. Why bother to really understand your code (for instance, trace it by hand) when you can scan through it while it runs? Who cares how inefficient language X is; machines are fast!

This is my pet peeve: software manufacturers are burning my machine cycles, which I paid for with my hard-earned money, to reduce their effort in producing quality code. If you don't believe it, go to The Computer Shootout Benchmarks and compare the trendiest, most "modern" languages' performance with C. The best you'll find is C++, which looks to be about 30% slower than C in general. C#/Mono? C is 2-70 times faster. Java? C is 2-66 times faster. And C is hundreds of times faster than Ruby et al.
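
For the curious, here's the flavor of micro-benchmark that underlies such comparisons; the workload and timing harness below are my own sketch, not the shootout's actual methodology:

    /* A toy cross-language benchmark workload: time a CPU-bound
       recursive function with the standard C clock() facility.
       Compile with optimization, e.g., gcc -O2 fib.c */
    #include <stdio.h>
    #include <time.h>

    /* Naive recursive Fibonacci: trivial to port to any language,
       which is why workloads like this show up in shootouts. */
    static long fib(int n)
    {
        return n < 2 ? n : fib(n - 1) + fib(n - 2);
    }

    int main(void)
    {
        clock_t start = clock();
        long result = fib(35);
        double secs = (double)(clock() - start) / CLOCKS_PER_SEC;

        printf("fib(35) = %ld in %.3f s\n", result, secs);
        return 0;
    }

Port the same dozen lines to Java or Ruby, time them the same way, and you get exactly the sort of ratios the shootout reports.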

What do I get for all these extra cycles? The only thing I can see that is of value to me is platform independence. Is it worth buying a machine twice as fast as I might otherwise need? Maybe. Four times as fast? I'm not so sure. Seventy times as fast? I'm sorry. Where are all those extra cycles going? To absolving companies from the need to discipline their development processes. In other words, I'm paying more money for a computer so companies don't need to pay as much attention to the fine details of product quality.

Disclaimer: The above doesn't apply to computation-intensive applications, for which the more cycles the merrier. I don't play video games, but I do write simulation code. But should you really need a multi-gigahertz machine to write a letter? I wrote a 200-page dissertation, with drawings, graphs, and all sorts of typeset math, on a Mac SE (using LaTeX).

Thursday, January 05, 2006

Battlestar Galactica: style over coherence

I've avoided watching Battlestar Galactica, frankly because I couldn't bring myself to watch a remake of a not-even-mediocre TV series. (As an aside, by what brand of logic does the Sci Fi channel reason that such a remake makes more sense than reviving a show like Firefly?) Anyway, given that I avoided the Stargate series for a similar reason (a remake of a mediocre SF movie mostly memorable because you could almost see the telephone Kurt Russell used to phone in his role), only to learn later that the TV series was far superior, I tuned in tonight. I wasn't impressed.

The first thing I noticed was the style of using hand-held cameras. By this, I mean like the home video my wife shot before we bought a camcorder with image stabilization. It's not just annoying; it almost made me dizzy. It certainly made it difficult to track what was going on.

The next thing I noticed was that the military uniform in the series seems to include a liberal coating of blood, preferably over the face. I'm not sure what the characters in the series are doing, but they might want to use their hands more to keep things from smacking them in the face. Or put on football or hockey helmets with face masks.

Then there are the scenes on "Cylon-occupied Caprica", in which the colors are all strange and the contrast is way too high. I guess Cylon occupation somehow changes not only the color of sunlight but also the ability of human eyes to adjust to it.

The Cylons are robots. Or they were in the old series. In this series, some are robots and some look like people, frequently platinum blondes in tight clothes. The characters talk about the human-appearing Cylons being machines, but then there's one that gets pregnant by a human. I may not be an obstetrician, but I don't see how that's possible, or even how it makes sense. Are they like goodlife from Fred Saberhagen's Berserker books? Then they wouldn't be machines. It's not just confusing; it simply doesn't make any sense, no matter how hard I try to suspend my disbelief.

The final flaw I see is simply the size of the cast. There are so many characters, and so much cutting between their stories, that I have trouble keeping track of them (the fact that their faces are often obscured by a coating of blood doesn't help). This doesn't make it easy to develop any empathy for the characters.

So, that's my first impression. Lots of "style" (jerky camera work, strange lighting, blood for "grittiness") and lots of complexity (many characters, multiple locations, ships full of twisty corridors, all alike and all strangely empty) make for a confusing, uncaptivating experience. We'll give it another couple of chances, but I'm not optimistic.

Monday, January 02, 2006

For every thing, there is a season

Like a number of other professions, academia has an annual rhythm tied to the seasons. Part of that rhythm is the sequence of calls for papers from conferences. And so, I must once again comment on the "World Multi-Conference on Systemics, Cybernetics and Informatics" (WMSCI), as the internet is flooded with their annual spam emails inviting participation (hence earning them the appellation "spamference").

If you've done some work and want to let other people know about it, spend some time deciding on the appropriate venue. Sticking it on the web yourself is much more respectable than paying to make it seem like your paper was reviewed (when it likely wasn't).

Sunday, January 01, 2006

Some interesting Unix tools

Follow the link from the title to some useful Unix tools, courtesy of Tom Schneider at the National Institutes of Health. I find "atchange" and "nowhere" especially interesting.
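
As I understand it, atchange watches a file and runs a command whenever the file's modification time changes, which is handy for auto-rebuilding whatever depends on the file you're editing. Here's a rough sketch of that idea in C (the polling approach and one-second interval are my assumptions, not Schneider's actual implementation, which I believe is a script):

    /* A bare-bones file watcher in the spirit of atchange: poll a
       file's modification time once a second and run a shell
       command whenever it changes.
       Usage: ./watch somefile 'make' */
    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/stat.h>
    #include <unistd.h>

    int main(int argc, char *argv[])
    {
        struct stat st;
        time_t last = 0;

        if (argc != 3) {
            fprintf(stderr, "usage: %s file command\n", argv[0]);
            return 1;
        }

        for (;;) {
            if (stat(argv[1], &st) == 0 && st.st_mtime != last) {
                if (last != 0)
                    system(argv[2]);  /* file changed: run command */
                last = st.st_mtime;
            }
            sleep(1);                 /* crude polling interval */
        }
    }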