Thursday, February 05, 2015

A Deep Dive: the Velocity Manufacturing Simulation

In 1989 I graduated from the College of Wooster and then spent a year as an intern with Academic Computing Services there, writing newsletters and little software tools. In the summer of 1990 I moved to Ann Arbor, without a clear idea what I was going to do next.

I worked for a short while with the Department of Anthropology, but by the end of 1990, I had found a job with the Office of Instructional Technology.

OIT was sort of the University's answer to the MIT Media Lab. It was an organization where instructional designers, programmers, and faculty members could work together on projects to bring technology into classrooms. It was a pretty remarkable workplace, and although it is long gone, I am truly grateful for the varied experiences I had there. It was the early days of computer multimedia, a sort of wild west of platforms and tools, and I learned a lot.

In January of 1993 my girlfriend and her parents visited my two workplaces, OIT headquarters and the Instructional Technology Lab, a site in the Chemistry building. I handed my girlfriend a video camera and proceeded to give a very boring little talk to her, and her extremely patient parents. Wow, I was a geek. I'd like to think my social skills and ability to make eye contact are a lot better now, but I probably haven't changed as much as I imagine that I have. I'm an extraverted geek now: when I am having a conversation with you, I can stare at your shoes.

I have carried the original analog Hi-8 videocassette around through many moves, and life changes, and only today figured out a good way to get it into my computer -- after giving the camcorder heads a very thorough cleaning. I thought the tape was pretty much a lost cause, and was going to try working with my last-ditch backup, a dub to VHS tape, but I'm pleased to learn that the video is still playable, and pleased that I could finally get this made, such as it is.

This project, the Velocity Manufacturing Simulation, was written in Visual Basic, long before it became VB.NET. I remember that it involved a fair amount of code, although I no longer have the source to look at. I remember painstakingly writing code for GUI elements like the animated disclosure triangles. There was some kind of custom controls library we bought separately; the details escape me. There was some kind of ODBC (maybe?) database plug-in that I can barely recall; I think Pete did most of the work on that part. Pete wrote parts of it, and I wrote parts of it. Now it seems almost laughably primitive, but you'll just have to take my word for it that back in the day it seemed pretty cool. It won an award. As far as I know, this is the only video footage of the project.

The code is 147 years old in Internet years. It was almost half my lifetime ago. But at the same time it seems like I just left that office, and somehow if I could figure out where it was, I could still go back and find everyone there in the conference room having lunch, and after lunch settle back into my old office with the vintage, antique computers.

This was only one of several projects I worked on while I worked at OIT. I have some other bits of video for a few of them, but not all. I will get clips up for at least one more. I wish there was more tape, and better tape, even if the only one nostalgic about these projects is me.

Perhaps "enjoy" is the wrong word, but take a moment to remember what instructional multimedia was like, a few months before a group called NCSA released a program called Mosaic and the world started to hear about this exciting new thing called the World Wide Web... but grandpa's tired, kids, and that's a story for a different day.

Wednesday, November 13, 2013

Apple Breaks Apache Configurations for Gitit (Again)

I'm not quite sure why I put myself through this, but I upgraded my Mac Pro to Mavericks. This broke my local Gitit wiki. The symptom was that Apache was unable to start, although nothing was written to the error logs. To determine what was wrong I used sudo apachectl -t. The installer did preserve my httpd.conf, but wiped out the module library that I had installed in /usr/libexec/apache2. See the old entry I wrote back when I fixed this for Mountain Lion.

I installed Xcode 5 and thought I was set, but there is more breakage. You might need to run xcode-select --install to get headers into /usr/include. The makefile in /usr/share/httpd/build/ is still broken in Mavericks, so commands like sudo apxs -ci -I /usr/include/libxml2 mod_xml2enc.c won't work.

To make a long story short, I got the latest (development) version of the mod_proxy_html source, and these commands worked for me:

sudo /usr/share/apr-1/build-1/libtool --silent --mode=compile --tag=CC /usr/bin/cc \
    -DDARWIN -DSIGPROCMASK_SETS_THREAD_MASK \
    -I/usr/local/include -I/usr/include/apache2 -I/usr/include/apr-1 -I/usr/include/libxml2 -I. \
    -c -o mod_xml2enc.lo mod_xml2enc.c && sudo touch mod_xml2enc.slo

sudo /usr/share/apr-1/build-1/libtool --silent --mode=compile --tag=CC /usr/bin/cc \
    -DDARWIN -DSIGPROCMASK_SETS_THREAD_MASK \
    -I/usr/local/include -I/usr/include/apache2 -I/usr/include/apr-1 -I/usr/include/libxml2 -I. \
    -c -o mod_proxy_html.lo mod_proxy_html.c && sudo touch mod_proxy_html.slo

Previously, this gave me .so files in the generated .libs directory, but now I just have .o files and I'm not sure that's what I want.
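If you run into the same thing, the missing piece is probably the libtool link step, which is what actually produces the .so files under .libs. I haven't re-verified the exact flags on Mavericks, so take this as a sketch of the usual apxs-style link invocation (install path assumed to be /usr/libexec/apache2), not a known-good recipe:

sudo /usr/share/apr-1/build-1/libtool --silent --mode=link --tag=CC /usr/bin/cc \
    -o mod_xml2enc.la -rpath /usr/libexec/apache2 -module -avoid-version mod_xml2enc.lo
sudo /usr/share/apr-1/build-1/libtool --silent --mode=link --tag=CC /usr/bin/cc \
    -o mod_proxy_html.la -rpath /usr/libexec/apache2 -module -avoid-version mod_proxy_html.lo
sudo cp .libs/mod_xml2enc.so .libs/mod_proxy_html.so /usr/libexec/apache2/

You may also need to add -lxml2 (and a -L path for it) to the mod_proxy_html link if the loader complains about unresolved libxml2 symbols, and the corresponding LoadModule lines still have to be present in the Apache configuration; sudo apachectl -t will tell you whether Apache can actually load the modules.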

Sunday, August 11, 2013

More Crappy Print-on-Demand Books -- for Shame, Addison-Wesley "Professional"

So, a while back I wrote about some print-on-demand editions that didn't live up to my expectations, particularly in the area of print quality -- these Tor print-on-demand editions.

Now, I've come across one that is even worse. A few days ago I ordered a book from Amazon called Imperfect C++ by Matthew Wilson -- it's useful, thought-provoking material. Like the famous UNIX-Haters Handbook, it's written for people with a love-hate relationship with the language -- that is, those who have to use it, and who desperately want to get the best possible outcomes from using it, writing code that is as solid and portable as possible, and working around the language's many weaknesses. (People who haven't used other languages may not even be aware that something better is possible, and may assume that complaints about the language are just sour grapes; I'm not really talking to those people.)

The universe sometimes insists on irony. My first copy of Imperfect C++ arrived very poorly glued; the pages began falling out as soon as I opened the cover and began to read. And I am not hard on books -- I take excellent care of them.

So I got online and arranged to return this copy to Amazon. They cross-shipped me a replacement. The replacement is even worse:

Not only are the pages falling out, because they were not properly glued, but the back of the book had a big crease:

So I guess I'll have to return both.

I'll look into finding an older used copy that wasn't print-on-demand. But then of course the author won't get any money.

Amazon, and Addison-Wesley, this is shameful. This book costs $50, even with an Amazon discount. I will be sending a note to the author. I'm not sure there is much he can do, but readers should not tolerate garbage like this. Amazon, and Addison-Wesley, fix this! As Amazon approaches total market dominance, I'm reminded of the old Saturday Night Live parody of Bell Telephone: "We don't care. We don't have to. We're the Book Company."

Thursday, August 01, 2013

Arduino, Day 1

A friend of mine sent me a RedBoard and asked me to collaborate with him on a development idea. So I'm playing with an Arduino-compatible device for the first time. I've been aware of them, but just never got one, in part because after writing embedded code all day, what I've wanted to do with my time off is not necessarily write more embedded code.

I downloaded the Arduino IDE and checked that out a bit. There are some things about the way it's presented that drive me a little batty. The language is C++, but Arduino calls it the "Arduino Programming Language" -- it even has its own language reference page. Down at the bottom the fine print says "The Arduino language is based on C/C++."

That repels me. First, it seems to give the Arduino team credit for creating something that they really haven't. They deserve plenty of credit -- not least for building a very useful library -- but not for inventing a programming language. Second, it fails to give credit (and blame) for the language to the large number of people who actually designed and implemented C, C++, and the GCC cross-compiler running behind the scenes, with its reduced standard libraries and all. And third, it obfuscates what programmers are learning -- especially the distinction between a language and a library. That might keep things simpler for beginners but this is supposed to be a teaching tool, isn't it? I don't think it's a good idea to obfuscate the difference between the core language (for example, bitwise and arithmetic operators), macros (like min), and functions in the standard Arduino library. For one thing, errors in using each of these will result in profoundly different kinds of diagnostic messages or other failure modes. It also obfuscates something important -- which C++ is this? Because C++ has many variations now. Can I use enum classes or other C++11 features? I don't know, and because of the facade that Arduino is a distinct language, it is harder to find out. They even have the gall to list true and false as constants. If there's one thing C and C++ programmers know, and beginners need to learn quickly, it's that logical truth in C and C++ is messy. I would hate to have to explain to a beginner why testing a masked bit that is not equal to one against true does not give the expected result.
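To make that last point concrete, here is the kind of beginner trap I mean; the variable and the mask are made up for illustration, but the behavior is just standard C++:

uint8_t status = 0x04;            // imagine this came from reading a port or register

if ((status & 0x04) == true) {    // WRONG: (status & 0x04) is 4, true converts to 1, and 4 == 1 is false
    // this branch never runs, even though the bit is set
}

if (status & 0x04) {              // RIGHT: any nonzero value is treated as "true" in a condition
    // this branch runs as expected
}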

Anyway, all that aside, this is C++ where the IDE does a few hidden things for you when you compile your code. It inserts a standard header, Arduino.h. It links in a standard main(). I guess that's all helpful. But finally, it generates prototypes for your functions, and that implies a parsing stage, via a separate tool that is not a C++ compiler.
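What prototype generation means in practice: in an .ino sketch you can call a function that is defined further down the file, because the build quietly inserts a declaration for it near the top; in plain C++ you would have to write that declaration yourself. A made-up illustration (blinkTwice is a hypothetical function, not part of the Arduino library):

void setup() {
    blinkTwice(13);              // fine in an .ino: the IDE generates "void blinkTwice(int pin);" for you
}                                // in a plain .cpp file, you must write that prototype yourself

void loop() {
}

void blinkTwice(int pin) {       // defined after its first use
    pinMode(pin, OUTPUT);
    for (int i = 0; i < 2; i++) {
        digitalWrite(pin, HIGH);
        delay(100);
        digitalWrite(pin, LOW);
        delay(100);
    }
}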

On my Mac Pro running Mountain Lion, the board was not recognized as a serial device at all, so I had to give up using my Mac, at least until I can resolve that. I switched over to Ubuntu 12.04 on a ThinkPad laptop. The IDE works flawlessly. I tried to follow some directions to see where the code was actually built by engaging a verbose mode for compilation and uploading, but I couldn't get that working. So I ditched the IDE.

This was fairly easy, with the caveat that there are a bunch of outdated tools out there. I went down some dead ends and rabbit holes, but the procedure is really not hard. I used sudo apt-get install to install arduino-core and arduino-mk.
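For the record, assuming the stock Ubuntu 12.04 package names mentioned above, the install is a one-liner:

sudo apt-get install arduino-core arduino-mk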

There is now a common makefile in my /usr/share/arduino directory and I can make project folders with makefiles that refer to it. To make this work I had to add a new export to my .bashrc file, export ARDUINO_DIR=/usr/share/arduino (your mileage may vary depending on how your Linux version works, but that's where I define additional environment variables).

The Makefile in my project directory has the following in it:

BOARD_TAG    = uno
ARDUINO_PORT = /dev/serial/by-id/usb-*
include /usr/share/arduino/Arduino.mk

And nothing else! Everything else is inherited from the common makefile. I can throw .cpp and .h files in there, and make builds them and make upload uploads them.

If you have trouble with the upload, you might take a look at your devices. A little experimentation (listing the contents of /dev before and after unplugging the board) reveals that the RedBoard is showing up on my system as a device under /dev/serial -- in my case, /dev/serial/by-id/usb-FTDI_FT232R_USB_UART_A601EGHT-if00-port0 and /dev/serial/by-path/pci-0000:00:1d.0-usb-0:2:1.0-port0 (your values will no doubt vary). That's why my Makefile reads ARDUINO_PORT = /dev/serial/by-id/usb-* -- so it will catch anything that shows up in there with the usb- prefix. If your device is showing up elsewhere, or you have more than one device, you might need to tweak this to properly identify your board.
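If you want to repeat that before-and-after experiment yourself, something along these lines works; the temporary file names are just examples:

ls /dev > /tmp/dev-before.txt     # with the board unplugged
ls /dev > /tmp/dev-after.txt      # with the board plugged in
diff /tmp/dev-before.txt /tmp/dev-after.txt
ls -l /dev/serial/by-id/          # the stable, by-id name used in the Makefile shows up here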

When you look at the basic blink demo program in the Arduino IDE, you see this, the contents of an .ino file (I have removed some comments):

int led = 13;

void setup() {
    // initialize the digital pin as an output.
    pinMode(led, OUTPUT);
}

// the loop routine runs over and over again forever:
void loop() {
    digitalWrite(led, HIGH);   // turn the LED on (HIGH is the voltage level)
    delay(1000);               // wait for a second
    digitalWrite(led, LOW);    // turn the LED off by making the voltage LOW
    delay(1000);               // wait for a second
}

The Makefile knows how to build an .ino file: it inserts the necessary header, supplies the implementation of main(), and generates any necessary prototypes. But if you want to build this code with make as a .cpp file, it needs to look something like this:

#include <Arduino.h>

int led = 13;

void setup() {
    // initialize the digital pin as an output.
    pinMode(led, OUTPUT);
}

// the loop routine runs over and over again forever:
void loop() {
    digitalWrite(led, HIGH);   // turn the LED on (HIGH is the voltage level)
    delay(1000);               // wait for a second
    digitalWrite(led, LOW);    // turn the LED off by making the voltage LOW
    delay(1000);               // wait for a second
}

int main(void)
{
    init();                    // Arduino core initialization

#if defined(USBCON)
    USBDevice.attach();        // only relevant on boards with native USB
#endif

    setup();

    for (;;) {
        loop();
        if (serialEventRun) serialEventRun();
    }

    return 0;
}
And there it is -- C++, make, and no IDE. Relaxen and watchen Das blinkenlights!

Tuesday, July 30, 2013

Lexx is Wretched

I have a fondness for science fiction series that are imaginative but not, as a whole, successful. Farscape, I'm talking about you. Even, occasionally, those that start out promising but turn into complete failures -- failure can be interesting, at least as an object lesson in how a story line can go so very far wrong. Andromeda, I've got your number. I can deal with very dated CGI -- Babylon 5 is still generally good and often great. So I happened to come across discounted boxed sets of Lexx, the whole series, at my local Target store. They were dirt cheap. "How bad could it be?" I thought. Well, now I know. At least, I know part of the story.

First off, Lexx is not something I can show my kids -- pretty much at all. Season 1 has a surprising amount of very fake gore in it -- brains and guts flying everywhere. That didn't really bother them -- I think they got that the brains were made of gelatin -- but it was getting to me. Watching characters carved up by rotating blades, repeatedly; watching characters getting their brains removed -- that got old. Body horror, body transformation -- pretty standard stuff for B grade science fiction, or anything that partakes of the tropes of such, but not actually kid-friendly. So we didn't continue showing the kids.

Still, I thought it might make more sense to watch them in order, so I watched the second two-hour movie (1:38 without commercials). The second one has full frontal nudity, which startled me a bit. I'm not really opposed to looking at a nubile young woman, per se. There is some imaginative world-building and character creation here, but ultimately it's just incredibly boring. It's like the producers shot the material, not having any idea how long the finished product would be; they shot enough scenes to actually power an hour show (forty-plus minutes without commercials), but also shot a bunch of extended padding sequences, "just in case." And so after a repeated intro that lasts just under four minutes, we get a two-hour show with endless cuts to spinning blades slowly approaching female groins, huge needles slowly approaching male groins, countdown timers counting down, getting stopped, getting started, getting stopped... endless fight scenes, endless scenes of the robot head blathering his love poetry, a ridiculous new character eating fistfuls of brains... et cetera, et cetera, et cetera.

Every time something happened, I'd get my hopes up, thinking that maybe the writing had actually improved, but then it was time to slow the show down again, because there was still an extra hour and twenty minutes to pad. And it's all distressingly sexist and grotesquely homophobic. Again, I'd be lying if I said that I didn't like to look at Eva Habermann in a miniskirt, but given that the actress is actually young enough to be my daughter, and especially given that she has so little interesting to do, and there's just not much character in her character -- well, "gratuitous" doesn't even begin to cover it. She's young, but Brian Downey was old enough to know better. And let's just say I'm a little disgusted with the choices the show's producers made. The guest stars in Season 1 are like a who-used-to-be-who of B actors -- Tim Curry, Rutger Hauer, Malcolm McDowell. There's material here for a great cult show -- but these episodes are mostly just tedious. They're actually not good enough to be cult classics.

The season consists of four two-hour movies. After watching the first movie, I didn't quite realize all four season one movies were on one disc, so when I tried to watch some more, I put in the first disc of season two by mistake. I watched the first few episodes of season two -- these are shorter. I didn't notice any actual continuity issues. In other words, nothing significant changes from the pilot movie to the start of season two. There are some imaginative satirical elements. Season 2, episode 3 introduces a planet called "Potatohoe" which is a pretty funny satire of the American "right stuff" tropes. But it's too little, and it amounts to too little, amidst the tedious general adolescent sex romp. Then we lose Eva Habermann, who was 90% of the reason I even watched the show this far. I'm honestly not sure if I can watch any more.

It doesn't help that several of the discs skip a lot. It might have something to do with the scratches that were on the discs when I took them out of the packaging, which come from the fact that the discs are all stuck together on a single spindle in the plastic box. And the discs themselves are all unmarked, identifiable only by an ID number, not any kind of label indicating which part of which season they hold -- so good luck pulling out the one you want.

I'm told the later seasons have some very imaginative story lines. People say good things about the third season. It seems like the universe has a lot of potential. Is it worth continuing, or am I going to be in old Battlestar Galactica's second season territory?

UPDATE: I have continued skimming the show. The scripts seem to get somewhat more interesting around season 2, episode 5, called "Lafftrak." It finally seems to take its darkness seriously enough to do something interesting with it, and not just devolve to pornographic settings. The pacing is still weak, but the shows start to feel as if they have a little bit of forward momentum. Of course, then in the next episode, we're back to Star Whores and torture pr0n...

Wednesday, July 24, 2013

The Situation (Day 135)

So, it's day 135. This is either the last covered week (week 20) of unemployment benefits, or I have three more; I'm not quite sure. Without a new source of income, we will run out of money to cover mortgage payments either at the end of September or the end of October. We have burned through the money I withdrew from my 401K in March when I was laid off. I've been selling some possessions, guitars and music gear, but this is demoralizing, and not sustainable. We don't have much more that is worth selling.

I was fortunate to have a 401K to cash out, and to get the food and unemployment benefits I've gotten -- so far I have been able to pay every bill on time, and my credit rating is completely unscathed. But winter is coming. And another son is coming -- Benjamin Merry Potts, most likely around the middle of October.

Emotionally, the situation is very confusing. On the one hand, I have several very promising job prospects, and I'm getting second phone interviews. But these are primarily for jobs where I'd have to relocate, and a small number of possible jobs that might allow me to work from home. This includes positions in Manhattan and Maine. We're coming to grips with the fact that we will most likely have to leave Saginaw. It's a well-worn path out of Saginaw. We were hoping to stick with the road less traveled, but we can't fight economic reality single-handed. And we don't really have any interest in relocating within Michigan, again. If we're going to have to move, let's move somewhere where we won't have to move again -- someplace where, if I lose one job, there's a good chance I can quickly find another.

So, we are willing to relocate, for the right job in the right place. The right place would be the New England area -- Grace is fed up here, and I am too. Maine, Vermont, New Hampshire, Massachusetts, Connecticut, New York, or eastern Pennsylvania are all appealing. But it would not be a quick and easy process. It would probably involve a long separation from my family. I don't relish that idea, especially if my wife has a new baby. That might be what it takes, though. I'll do it for the right job and the right salary and the right place. In any case, we can't move with either a very pregnant woman or a newborn, and selling, or even renting out, a house would not be quick or easy either. A benefit to a permanent job in Manhattan is that it would pay a wage that is scaled for the cost of living there. It might be perfectly doable for me to find as cheap a living arrangement there as I can, work there, and send money home. A Manhattan salary would go a long way towards maintaining a household in Michigan, and helping us figure out how to relocate, and I'd probably be able to fly home fairly frequently.

I would consider a short-term remote contract job where I wasn't an employee, and didn't get benefits, and earned just an hourly wage. Let's say it was a four-hour drive away. I'd consider living away from home during the work week, staying in an extended-stay motel, and driving home on weekends. But it would have to pay well enough to be able to do that commute, pay for that hotel, and be able to send money home -- enough to pay the mortgage and bills. A per diem would help, but the contract work like this I've seen won't cover a per diem. We'd need to maintain two cars instead of one. Grace would need to hire some people for housekeeping and child care help. I wouldn't be there to spend the time I normally spend doing basic household chores and helping to take care of the kids.

Would I consider a contract job like that farther away -- for example, an hourly job in California? That's tougher. I think I could tolerate seeing my wife and kids only on weekends, if I knew that situation would not continue indefinitely. But if I had to fly out, even that probably wouldn't be possible. California has very little in the way of public transportation. Would I have to lease a car out there, so I could drive to a job? Take cabs? It might make more sense to buy a used car once out there. In any case, it would cost. Paying for the flights, the hotel, and the car, with no per diem, it's hard to imagine that I'd be able to fly home even once a month. Would I do a job like that if I could only manage to see my family, say, quarterly? Let's just say that would be a hardship. I would consider an arrangement like this if it paid enough. But the recruiters who are talking to me about these jobs are not offering competitive market rates. It doesn't seem like the numbers could work out -- I can't take a job that won't actually pay all our expenses.

The prospect of employment locally or within an hour commute continues to look very poor. I've applied for a number of much lower-paying IT or programming jobs in the region, and been consistently rejected. These jobs wouldn't pay enough to afford a long commute or maintain any financial security at all. In fact, I think we'd still be eligible for food stamps (SNAP) and my wife and kids would probably still be eligible for Medicaid. Their only saving grace is that they would pay the mortgage. Some of them might provide health insurance, at least for me. But I've seen nothing but a string of form rejections for these positions.

Grace and I don't get much quiet time -- we haven't had an actual date night, or an evening without the kids, since March. The closest we come is getting a sitter to watch the kids for a couple of hours while we run some errands. That's what we did last Sunday. I made a recording and turned it into a podcast. You can listen if you are interested.

Building a Podcast Feed File, for Beginners

I got a question about how to set up a podcast. I wrote up an answer, and thought that while I was at it, I might as well polish it a bit and post it, in case it would be helpful to anyone else.

I'm starting a podcast and I need help creating an RSS feed. You're the only person I could think of that might know how to create such a thing. Is there any way you could help me?

OK, I am not an expert on podcasts in general, because I've only ever created my own. I set mine up by hand. I'll tell you how I do that, and then you can try it that way if you want. You might prefer to use a web site that does the technical parts for you.

A podcast just consists of audio files that can be downloaded, and the feed file. I write my feed files by hand. I have a hosting site at DreamHost that gives me FTP access, and I upload audio files to a directory that is under the root of one of my hosted web site directories.

I write the feed file itself with a text editor. I use BBEdit, which is a fantastic text editor for the Macintosh that I've used for over 20 years, but any text editor will do. For the General Purpose Podcast, I maintain a single hand-written feed file.

The feed file contains information about the podcast feed as a whole, and then a series of entries, one for each episode (in my case, each audio file, although they don't strictly have to be audio files; you can use video files). When I add an audio file, I just add a new entry that describes the new audio file.

This is a slight simplification. I actually use a separate "staging" file for testing before I add entries to the main podcast feed. The staging file contains the last few episodes, and I have a separate subscription in iTunes to the "staging" podcast for testing purposes. When I upload a new episode MP3 file, I test it by adding an entry to the staging index file first.

So I add an entry to test, and then tell iTunes to update the staging podcast. If it works OK and finds a new episode, downloads it, and it comes out to the right length, and the tags look OK, then I add the same entry to the main index file.

I have a blog for the podcast too. That's a separate thing on Blogger. It just provides a jumping-off point to get to the episodes, and something I can post on Facebook or Twitter. For each episode I just make a new blog post, write a description, and include a link to the particular MP3 file. The blog's sidebar also has links to the feeds and to the iTunes store page for the podcast. I'll get to the iTunes store in a minute.

Oh, writing the entry in the feed file is kind of a pain. You have to specify a date, and it has to be formatted correctly, and it has to have the right GMT offset, which changes with daylight saving time. You have to specify the exact number of bytes in the file and the length in hours, minutes, and seconds. If you get these wrong, the file will not be downloaded correctly -- it will be cut off. The URL needs to be URL-escaped; for example, spaces become %20.

If I upload the file to my hosting site first, so that I can see the file in my web browser, and copy the link, it comes out URL-escaped for me, so that part is easy. I paste that link to the file into the feed file entry for the episode. The entry gets a link to the file, and then there is also a UID (a unique ID for the episode); see the example entry below. Personally, I use the same thing for both the UID and the link, but they can be different. The UID is how iTunes (or some other podcast reader) decides, when it reads your feed file, whether it has downloaded that file already, or whether it needs to download it again. So it's important to come up with a scheme for UIDs and then never change them, or anyone who subscribes to your podcast will probably either see errors or get duplicated files. In other words, even if I moved the podcast files to a different server, and the link needed to be changed, I would not change the UIDs of any of the existing entries.
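Just to make the shape of an entry concrete, here is roughly what one item looks like. The title, URLs, date, and sizes below are made-up placeholders, and the itunes: tags only work if your feed declares the iTunes namespace at the top (Apple's guide, mentioned below, covers that):

<item>
  <title>Episode 42: An Example Episode</title>
  <description>A short description of the episode.</description>
  <pubDate>Sun, 21 Jul 2013 14:30:00 -0400</pubDate>
  <guid isPermaLink="false">http://www.example.com/podcast/Episode%2042.mp3</guid>
  <enclosure url="http://www.example.com/podcast/Episode%2042.mp3"
             length="12345678" type="audio/mpeg" />
  <itunes:duration>25:43</itunes:duration>
</item>

The length attribute is the size of the MP3 in bytes, and itunes:duration is the running time; those are the two numbers that cause truncated downloads when they are wrong.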

Once you have your feed file, you can check it with a feed validator -- and you definitely should do this before giving it out in public or submitting it to the iTunes store. I try to remember to check mine every so often, just to make sure I don't have an invalid date or something like that. If the feed is not working, the validator might tell you why.

OK, the next thing is iTunes integration. The thing to keep in mind here is that Apple does not host any of your files or your feed. You apply to be in the podcast directory, and then someone approves it, and the system generates a page for you on Apple's site. Once a day or so it reads your feed file and updates that page. The point here is that if someone is having problems with your page on iTunes, it is probably not Apple's fault, it is probably a problem with your feed or your hosted audio files.

If you don't want to do this all manually, there are sites that will set up your feed for you automatically, Libsyn being one; I am not sure which one is best, and I have not used them.

Apple publishes a guide that includes information on how to tag your files in the feed -- you could start out with mine as an example, but Apple's guide is the de facto standard for writing a podcast feed that will work with iTunes and the iTunes store.

OK, now you know just about everything I know about it. Oh, there is one more thing to talk about. This part is kind of critical.

So you create an audio file -- I make a WAV file and then encode it into an MP3 file, either in Logic or in iTunes. My recent spoken-word files are encoded at 128 Kbps; if I were including music, I would use a higher bit rate. Some people compress them much smaller, but I am a stickler about audio quality, and 128 Kbps is about as much compression as I can tolerate.

You then have to tag it. This actually changes data fields in your MP3 file. The tagging should be consistent. You can see how my files look in iTunes. If the tagging is not consistent then the files will not sort properly -- they won't group into albums or sort by artist and that is a huge pain. When files get scattered all over your iTunes library, it looks very unprofessional and I tend to delete those podcasts. But note that the tags you add are not quite as relevant as they would be if you were releasing an album of MP3 files, and here's why -- podcasts have additional tags that are added by your "podcatcher" -- iTunes, or some other program that downloads the podcast files.

So you tag your MP3 file, and take note of the length (the exact length in bytes and the length in hours, minutes, and seconds), so that you can make a correct entry in your feed file. The MP3 file is the file you upload, but note that this file is not actually a podcast file yet. It doesn't show up in "Podcasts" under iTunes. It becomes a podcast file when iTunes or some other podcatcher downloads it. iTunes reads the metadata from the feed file (metadata is data about a file that is not in the file itself) and it uses parts of that metadata, like the podcast name, to add hidden tags to the MP3 file. Yes, it changes the file -- the MP3 file on your hard drive that is downloaded will not be exactly the same file you put on the server. This is confusing. But it explains why, if you download the MP3 file directly and put it in your iTunes library rather than letting iTunes download it as a podcast episode, it won't "sort" -- that is, it won't show up as an iTunes podcast under the podcast name.

At least, that has been true in the past. I think recent versions of iTunes have finally added an "advanced" panel that will let you tell iTunes that a file is a podcast file, but sorting it into the proper podcast this way might still be tricky. So the key thing is that you might want to keep both sets of files: your properly tagged source files, because those are the ones you would upload to your site if, for example, your site lost all its files, or if you were going to relocate your site to a new web server; and also the files after they have been downloaded and tagged as podcasts by iTunes. I keep them separately. If someone is missing an episode, I can send them the podcast-tagged file, and they can add it to their iTunes library and it will sort correctly with the other podcast files.

OK, now you pretty much know everything I know about podcast feeds. I prefer doing it by hand because I'm a control freak -- I like to know exactly what is happening. I like to tag my files exactly the way I want. But if you're not into that -- if you don't know how to upload and download files of various kinds and tag MP3 files, for example -- you probably want to use something like Libsyn. Or maybe you know what to do but just want to save time. I just know that I've sometimes been called on to help people using these services fix their feeds after they are broken, or they need to relocate files, and it isn't pretty, so I'll stick to my hand-rolled feed.

Monday, July 08, 2013

Building Repast HPC on Mountain Lion

For a possible small consulting project, I've built Repast HPC on Mountain Lion and I'm making notes available here, since the build was not simple.

First, I needed the hdf5 library. I used hdf5-1.8.11 from the .tar.gz. This has to be built using ./configure --prefix=/usr/local/ (or somewhere else if you are doing something different to manage user-built programs). I was then able to run sudo make, sudo make check, sudo make install, and sudo make check-install and that all seemed to work fine (although the tests take quite a while, even on my 8-core Mac Pro).

Next, I needed to install netcdf. I went down a versioning rabbit hole for a number of hours with 4.3.0 -- I was _not_ able to get it to work! With the older version I ended up using (see the version listing below), it's ./configure --prefix=/usr/local, make, make check, sudo make install.

Next, netcdf-cxx, the C++ interface. I used netcdf-cxx-4.2 -- NOT netcdf-cxx4-4.2 -- with ./configure --prefix=/usr/local/.

Similarly, boost 1.54 had all kinds of problems, so I had to use boost 1.48: ./bootstrap.sh --prefix=/usr/local and then sudo ./b2. The build process is extremely time-consuming, and I had to manually install both the boost headers and the compiled libraries.

Next, openmpi 1.6.0 -- NOT 1.6.5. ./configure --prefix=/usr/local/ seemed to go OK, although it seems to run recursively on sub-projects, so it takes a long time and creates hundreds of makefiles. Wow. Then sudo make install... so much stuff. My 8 cores are not really that much help, and don't seem to be busy enough. Maybe an SSD would help keep them stuffed. Where's my 2013 Mac Pro "space heater" edition, with a terabyte SSD? (Maybe when I get some income again...)

Finally, ./configure --prefix=/usr/local/ in repasthpc-1.0.1, and make succeeded -- after about four hours of messing around with broken builds. I had a lot of build issues with individual components, and then problems with Repast HPC itself even after everything else built successfully, before I finally found an e-mail message chain that had some details about the API changes between different versions and laid out a workable set of libraries:

They suggest that these versions work:

drwxr-xr-x@ 27 markehlen  staff    918 Aug 21 19:14 boost_1_48_0 
drwxr-xr-x@ 54 markehlen  staff   1836 Aug 21 19:19 netcdf- 
drwxr-xr-x@ 26 markehlen  staff    884 Aug 21 19:20 netcdf-cxx-4.2 
drwxr-xr-x@ 30 markehlen  staff   1020 Aug 21 19:04 openmpi-1.6 
drwxr-xr-x@ 31 markehlen  staff   1054 Aug 21 19:28 repasthpc-1.0.1

And that combination did seem to work for me. I was able to run the samples (after changing some directory permissions) with:

mpirun -np 4 ./zombie_model config.props model.props
mpirun -np 6 ./rumor_model config.props model.props


Notes on building Boost 1.54: doing a full build yielded some failures, with those megabyte-long C++ template error messages, so I had to build individual libraries. The build process doesn't seem to honor the prefix and won't install libraries anywhere but a stage directory in the source tree; I had to manually copy files from stage/lib into /usr/local/lib and manually copy the boost headers. There is an issue with building the mpi library, too:

./bootstrap.sh --prefix=/usr/local/ --with-libraries=mpi --show-libraries

sudo ./b2

only works properly if I first put a user-config.jam file in my home directory containing "using mpi ;". Then I have to manually copy the boost mpi library.
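For what it's worth, the manual install step amounted to something like the following, run from the top of the Boost source tree; the exact library filenames will vary by platform and by which libraries you built:

sudo cp stage/lib/libboost_* /usr/local/lib/
sudo cp -R boost /usr/local/include/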

Notes on building netcdf-cxx4-4.2: I had to use sudo make and sudo make install, since it seems to write build products into /usr/local/ even before doing make install (!)