We had the annual awards dinner for the Writers of the Future contest the other night. I always enjoy getting together with my fellow judges, most of whom are old friends I don't see very often any longer. Fred Pohl couldn't make it this year and neither could Annie McCaffrey, but our table at the dinner had Eric Flint, Mike Resnick, Anne's son and collaborator Todd, and some rather charming former WOTF winners. As usually happens when working writers get together, the conversation turned to business and contracts. I suppose there are writers who get together and talk about literature and characterization and other literary matters, but I seldom meet any. Mostly we talk business, generally to the disappointment of the newcomers who haven't been disillusioned yet.
Mike Resnick has been active: he found that he owned the electronic rights to a number of his early works, and that it was no great trick at all to have a commercial service scan in his older stuff and get it into Amazon Kindle format. It's also easy to deal with Amazon to get the books posted. If you let Amazon set the price, you keep 70% of any sales. I knew all of this, but the surprise was that in the three months Mike has been doing this he has been selling a couple of hundred books a month. Not much compared with the big advances publishers used to pay for a new novel, but hardly to be despised given what appears to be such a small amount of work involved. In Mike's case many of his early books are still "in print" according to the publishers and thus haven't been reverted, but it doesn't matter: the electronic rights were never owned by the publishers, and the Amazon Kindle publications are non-exclusive.
I came home determined to get my early books up into Kindle format, and discovered that Baen already has a number of them in eBook format, so I need to look into just who owns what of my older stories, and what's exclusive and what isn't.
One thing I know I own is the rights to my old BYTE columns. Of course many of them now exist only in print, or on 8" floppy disks - that's not an error. Many remember 5-1/4" floppies, which were actually floppy, but the "mass storage" back in S-100 buss days was a pair of 8" floppy disk drives. The drive box was as big as the computer. These wrote to floppies that held a whole 64 kilobytes per disk. That was soon upgraded to 128K.
Those early S-100 systems were 8-bit machines, which in practice meant a 16-bit address bus and thus a total of 64K of addressable memory. Since an English word averages five characters plus a space, or six (8-bit) bytes per word, a 128K floppy would hold about 20,000 words. It took a box of floppies to hold a novel. Alas, the old 8-inch floppies are pretty well unreadable now; the magnetic signal tends to degrade. No one has 8-inch drives anyway. Mine went to the Smithsonian along with the rest of old Ezekiel, my first computer.
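Checking that arithmetic with a quick sketch (the 100,000-word novel length is my assumption, chosen only for illustration):

```python
CHARS_PER_WORD = 5 + 1        # five letters plus a trailing space
FLOPPY_BYTES = 128 * 1024     # a 128K 8-inch floppy, one byte per character

words_per_floppy = FLOPPY_BYTES // CHARS_PER_WORD
print(words_per_floppy)       # 21845 -- "about 20,000 words"

NOVEL_WORDS = 100_000         # assumed length of a typical novel
floppies_needed = -(-NOVEL_WORDS // words_per_floppy)  # ceiling division
print(floppies_needed)        # 5 disks -- a box of floppies indeed
```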
I do have some archives on 3.5" Magneto Optical disks. MO recording is through a state change, so the information is in effect eternal. The problem is that the drive is a SCSI 1 drive, and I don't have a SCSI card or any USB to SCSI 1 conversion cable. I suppose I should look harder; I don't really know what is on those MO disks.
I also have a number of 5-1/4" rigid WORM disks, but I don't know what's on them. I was enthusiastic about WORM - Write Once, Read Many - drives, so I may have everything I ever wrote on that stack of disks. WORM disks held a lot of data, and were essentially eternal, being a form of Magneto Optical technology written to much larger disks. I have about a dozen WORM disks that may hold everything I had on the old 8-inch floppies. Of course I can't read them. I still have the drives - they are about as big as a shoe box - but they, too, are SCSI 1, and I am not at all sure about drivers. At some point I suppose I ought to get energetic about finding a way to see just what I have on those disks. Some of it might be useful.
Of course I have a number of files that I transferred from hard drive to hard drive as I updated each machine; and now there is "backup to cloud" software that many use. The moral of this story is that no medium is really eternal, because the mechanisms for reading the data may go out of style, then out of production, then out of existence.
The major problem with getting one's older works into electronic print is that few have electronic copies of the books. Having them scanned and edited can be expensive - not terribly so, but the costs can be significant - and the process usually involves destroying a printed copy so that it can be fed to a scanner. Multiple scans with comparisons can get copies arbitrarily close to the printed original; of course the printed copies will contain some typographical errors, so the author probably ought to do a final proofreading pass before reformatting for Kindle and iTunes publication. The temptation of course will be to do revisions: I don't know, but I suspect that a "Revised edition" with a new introduction by the author would increase sales.
Norman Spinrad has discovered Amazon and iTunes publication, and tells how he found clean copies of many of his works on pirate sites. Eric Flint has often said that pirates are more helpful than harmful; many authors have disagreed, but this is a clear example of how some pirates can be useful. Of course in Norman's case there were pirates who undertook the work as a labor of love because they wanted to see his books in print, and they took care to do quality scans; many pirate copies of books are horribly done.
Meanwhile, Eric Pobirs and Peter Glaskowsky have been busy finding me clean copies of my fiction and some non-fiction, and I'm going through those to see what I can come up with for Kindle publication. Eric is also involved in getting my old BYTE columns into shape, at which point I'll select the interesting ones that can be bundled up with modern-day comments. I don't know whether that will be a useful project or not. By useful I mean resulting in something that will sell enough copies to be worth the effort. There are times when I wonder why anyone would be interested in the early days of the computer revolution, but some of those columns were pretty good, so perhaps it will be. And of course at a 70% royalty it doesn't take more than a thousand sales to make publishing an already written book worth the effort for all but the most successful writers. The average advance (generally all it will earn) from a major publisher for a first fiction book is about $5,000. Of course Niven and I get more, but then our books are a lot more work than just collecting already written material and adding comments.
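The break-even arithmetic is easy enough to sketch; the list prices here are illustrative assumptions, not actual figures:

```python
ADVANCE = 5_000   # typical advance for a first novel, per above
ROYALTY = 0.70    # author's share when Amazon sets the price

for price in (2.99, 4.99, 6.99):   # hypothetical list prices
    sales_needed = ADVANCE / (price * ROYALTY)
    print(f"${price:.2f}: about {sales_needed:.0f} sales to match the advance")
```

At a price near $7 the author's cut is about $4.89 a copy, so roughly a thousand sales matches that typical advance.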
My friend Francis Hamit has invested considerable time into learning about the new publishing game, and I had more to say about this last month. The story is still developing, but it's pretty clear to me that electronic publishing is going to be very important to writers in the future.
Leo Laporte recently pronounced Microsoft "The Comeback Kids." I can see why. In Vista times I was ready to convert Chaos Manor from Windows to Mac, and I was part of the way there when Windows 7 came out. Windows 7 made Windows Good Enough for nearly all the operations important to Chaos Manor, so the pressure to make the conversion was lower. Office for Windows improved greatly, and while the new ribbon interface was beastly for long-term Office users to learn, it was learnable - and remains superior to Word for the Mac. Were I starting from scratch I'd probably be a Mac convert, but I'm not starting from scratch. I have a lot invested in Windows, particularly in the combination of Outlook and FrontPage, and I don't see any combination of Mac programs that would make it as easy for me to handle mail, record subscriptions, cut and paste mail into my web pages, insert images and illustrations, and compose essays both original and as comments on the mail. Perhaps there are such, and I know some have been suggested to me, but I haven't tooled up to try them yet - and Outlook plus FrontPage in Windows 7 really are Good Enough.
It's the same with phones, but in the other direction. Many respectable sources tell me that Apple's day in the sun with the iPhone may be coming to an end. Leo Laporte, an early iPhone enthusiast, is now a convert to Android as the smart phone, and is looking forward to the Microsoft Comeback Kid's upcoming Mobile Operating System. The Windows 7 smart phone isn't out, of course, but the SDK is available to developers (http://www.lanewsmonitor.com/news/Microsoft-Windows-Phone-7-Shuns-CDMA-1284810355/), and excitement has been running high. I haven't seen it in operation yet - I missed a couple of opportunities because of previous commitments - but those who have are excited.
Apple was down and nearly out until the iPod and iTunes restored their fortunes. That was followed by the new Intel Mac, all developed and integrated and marketed with genius by Jobs and his associates. Apple rose from the near dead to be a real contender, and the iPhone was wonderful, the closest thing I had ever seen to the pocket computer Niven and I first described in The Mote in God's Eye back in 1974.
The Apple dominance in mobile phones and pocket computers may be waning but the runaway success of the iPad is a big and very significant exception: not a phone and not quite a pocket computer, but certainly a success, as discussed elsewhere in this column. Leadership in smart phones for smart users is expected by many to pass from Apple to Google soon (if that hasn't happened already), and now the new Windows 7 telephone, redesigned and rebuilt from the ground up, may challenge Google. Of course all that depends on what you mean by "leadership". Apple doesn't lead in sales, but that's not their goal. Apple means to be the BMW of smart phones, not the Chevrolet or even the Cadillac.
Windows 7 Operating System, the coming Windows 7 smart phone - and now the beta of Internet Explorer 9 is out. I still use Firefox, but Explorer 9 is really cool. So is Google Chrome.
All of which is vague. Many say Apple's dominance may be waning and the Windows 7 telephone may challenge Google - but really that's all I can say just now. Apple is very good at keeping its secrets, while Microsoft has a long reputation for generating buzz for future products. It's hard to make a story out of rumors - but it isn't wise to ignore these trends either. Our hardware is good enough to support vast improvements in features and performance of our small mobile computers. We know where it's going: I wrote about all this in The Mote in God's Eye. The question is how much of that capability we will get, and when, and from whom, and on that there's just not enough information.
What we do know is that Google is racing hard, Microsoft is back, and Apple has competition. Exciting times.
The main reason I haven't got the new iPhone 4 is that I am disgusted with AT&T, which does not seem interested in providing coverage in Studio City. It's not as if they can't. When this was Cingular I had good coverage in my house and in the surrounding area; but when AT&T and Cingular merged, the service went down to one or two bars out on my balcony, and even with the MicroCell it's never very good.
I keep informing AT&T of this, and nothing happens. Were it not for the egregious AT&T service in the Studio City triangle I would have a much better opinion of the iPhone. Alas the blackout extends for many blocks from my house, down to within a couple of blocks of the AT&T store on Ventura at Whitsett, so that when I am out for my morning walk I cannot use my pocket computer to do anything that requires an Internet connection.
There was a time when I'd probably just jailbreak my iPhone and go on from there, but I'm not in a mood to be an experimenter when it comes to telephone service and use of my pocket computer.
The AT&T/iPhone pairing is the best argument against buying an iPhone. A pox on AT&T.
One friend comments on jailbreaking the iPhone:
I did this with my old iPhone 3G after getting the iPhone 4. It took about ten minutes. The 3G is now happily running on a prepaid T-Mobile account as a backup.
The basic pay-as-you-go T-Mobile service cost me $20 to set up and will cost another $40 per YEAR assuming I use the phone less than ten minutes per month on average. (So far I've averaged about zero. :-)
For normal use, T-Mobile has normal contract-based plans too.
But just for the record, I don't think T-Mobile's coverage is much better than AT&T's.
A few years ago the big communications problem was the last few hundred feet from the backbone to houses. One solution to that was wireless, and there were a number of local wireless networks that gave high speed Internet access in places that the big phone and cable places did not. Over time the big guys provided Internet access, and the wireless nets closed down - after which there was political lobbying to restore some kind of monopoly status to big providers, and other political maneuvering.
One answer to the Net Neutrality movement is that wireless will provide competition for landline and cable Internet access, so regulation is needless. I'd have thought that competition would already have brought better wireless connectivity including telephone service to a choice market like Studio City, with its fairly large component of successful writers and other entertainment industry techies. So far that hasn't happened. We'll see.
The local chapter of the National Association of Science Writers organized a short tour of JPL, and I realized I hadn't been out there in a long time so I signed up. I used to go to JPL several times a year, back in the glory days of Pioneer and Voyager and Mariner when the spacecraft were giving us our first looks at Mars, and Venus, and Jupiter and beyond. Those were exciting days, and scenes from JPL were in most of my writings of that time. Niven and I put a scene from the first Saturn encounter in Footfall, and of course I wrote up the planetary encounters in BYTE and Galaxy and my National Catholic Press Twin Circle science columns. You can find some stories from that era in A Step Farther Out.
I was surprised to find that our tour was conducted by Mark Griffin, whom I had last seen when he was my Senior Patrol Leader back in the days when I took the Boy Scouts on fifty mile hikes into the High Sierra.
The clean room at JPL
Closeup of the Mars Science Laboratory
Things have pretty well changed at JPL since the days when Niven and I set scenes there for Lucifer's Hammer and Footfall. The von Kármán Center, which was the big building in which JPL housed the visiting press and held press conferences during the encounters, is now a museum. There's not enough action to sustain a big press center any longer.
On the other hand, they have opened viewing galleries to the clean rooms where they are building the spacecraft, and to the Deep Space Network control room. There are also various films from the old days, and more about new projects, and that's all very much worth seeing.
The photographs show the clean room at JPL, where the Mars Science Laboratory is being assembled. The Mars Science Laboratory is big. Mars Rover was about the size of a big breadbox. Spirit and Opportunity were smaller than a breakfast table. The Mars Science Laboratory is the size of a Humvee.
It was an interesting tour, and the Mars Science Laboratory will tell us a lot about Mars. Alas, for now at least, it's the last big Mars rover we can build. Previous Mars rovers were powered by solar panels, and thus were restricted to daytime operations in equatorial latitudes. Spirit and Opportunity are separated by only about 10 degrees of latitude, but that is enough to make it necessary for Opportunity to sleep during the Martian winter; there's just not enough solar power in daytime to keep her going.
The Mars Science Laboratory will be powered by a Radioisotope Thermoelectric Generator (RTG), which is a fancy way to describe a can of radioactive stuff decaying away and generating heat as it decays. A number of radioisotopes will do to fuel an RTG, but for a number of reasons the best is Plutonium-238. Pu-238 radiation is easily shielded, the half-life is long enough that it will generate heat for a long time, and it has been used in many spacecraft so we understand it well.
Alas, there's no more Pu-238 available for use in RTGs. The Mars Science Laboratory will use the last we have. There's plenty more sitting under water at local storage facilities of nuclear power plants, but the US no longer reprocesses spent fuel and recycles it. If we build any more spacecraft they'll have to be powered with something else.
There's no shortage of Pu-238 in the United States, but there is a shortage of the political courage to open up the spent fuel reprocessing centers, which were highly successful and were shut down for political reasons.
Eric Pobirs recently sent me a copy of the Computing at Chaos Manor column from March, 1986. This has always been The Users Column, but in those days one of the things computer users did was experiment with programming, and one of the continuing debates was on computer languages. Those were the days before C took over the programming universe.
I found the column fascinating, and I'm thinking of putting together a bundle of the best of the old columns with comments. Meanwhile, I find that in 1986 (actually in November, 1985; there was a 3-month lag between my finishing a column and its appearance) I wrote:
Gordon Eubanks is a former Navy submarine driver. He's been in the microcomputer revolution since the beginning. He wrote the first compiled BASIC -- public-domain EBASIC -- while a professor at the US Naval Postgraduate School in Monterey, California. After retiring from the Navy, he turned EBASIC into CBASIC which, despite my enthusiasm for Modula-2, is still the language I've used for all my large and important programs. Eventually, Gordon's company was sold to Digital Research, and Gordon became Digital Research's vice president in charge of languages. About a year ago, he left to run a new outfit known as Symantec.
Symantec is seriously trying to apply artificial intelligence (AI) principles to business programs. The first result is an integrated package known as Q&A (see the Product Preview in the January BYTE, page 120), and just before my monster party, Gordon wanted to come down and show it to me. I don't usually let software publishers find out what city Chaos Manor is located in, much less invite them to my house. But Gordon isn't a publisher. He's a hacker from the early days, and Lord knows back in those days I bent his ear enough about problems I'd found and features I wanted in CBASIC.
I went on in that column to talk about Q&A, and in fact after a few months Niven and I adopted Q&A and its text editor Q&A Write as programs of choice, and we wrote a number of books using that software. Q&A with its intelligent assistant was the best database program of its day. Alas, it didn't survive the transition from DOS to Windows; but it was good enough that I put off converting to Windows for more than a year because I didn't want to give up Q&A and Q&A Write. Eventually video processor chips got powerful enough to allow Word to refresh a screen in Windows without taking long enough that you could go make coffee while waiting, and as often happened in those days improving hardware made Microsoft software not only usable but the best stuff for the job, and Office took over, driving Q&A and Q&A Write out into the cold. The Microsoft machine ground slowly, but it ground exceeding fine, and Symantec just didn't have the resources to keep up. I regret that, because Q&A was a really innovative program, easy to use, efficient, and in some ways better than anything we have today. But that's another story.
The point is that we used to debate languages in those days. For example, later in the March, 1986 column I wrote:
Logitech has unbundled its Modula-2 compiler; you can get it for $89. What you get is the full compiler with integrated text editor. It generates native code for IBM PCs and compatibles. The code is certainly comparable in speed and efficiency to the best PCompatible C and Pascal compilers I've seen.
I've said it before, but there's no harm in saying it again: if you like Pascal, you'll like Modula-2, and, moreover, you won't have much trouble learning it. Most Pascal source code can be translated into Modula-2 by means of programs written in Modula-2.
The main advantage of Modula-2 over Pascal - and darned near any other language - is the total independence of the modules. In Modula-2, no matter what it is, if you didn't explicitly import it, it can't affect what's going on inside the module; and if you don't export it, it can't affect anything else. The result is that you can build up library after library of small modules and never have to worry about variable names. (Who cares if you have 400 different counting variables called i? They can't affect each other if they're in separate modules.) You can also hand someone else the definition module describing what your code does, let that person write code to mate with yours, and be secure in the knowledge that nothing your partner does can have side effects inside your own modules.
Flash: Workman and Associates have a CP/M Z80 Modula-2 compiler they call FTL Modula. It's less than $100, fast, and you get the source code to its integral editor.
Modula-2 has been getting theoretical applause for years: what it has needed was a good low-cost compiler for a popular machine. Logitech has remedied that defect and is to be congratulated. If you hack, try Modula-2. Even if you don't like the language, you ought to know something about it. If you haven't done any hacking, here's your chance.
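Python isn't Modula-2, but its module objects give a rough modern analogy to that independence: each module's names are invisible outside it unless explicitly exported and imported. A sketch, with counter modules invented purely for illustration:

```python
import types

def make_counter(name):
    """Build a tiny module-like namespace with its own private counter i."""
    mod = types.ModuleType(name)
    mod.i = 0
    def bump():
        mod.i += 1
        return mod.i
    mod.bump = bump
    return mod

a = make_counter("counter_a")
b = make_counter("counter_b")
a.bump(); a.bump()   # counter_a's i goes to 2
b.bump()             # counter_b's i goes to 1
print(a.i, b.i)      # 2 1 -- two variables named i, and they never collide
```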
Those discussions were frequent during the transition from CP/M to DOS to Windows. They are not now, and I think that's a shame. We have now conceded the contest to C, and the problem with C is that it's a specialty. It takes time to learn it. Worse, its structure is such that there are many ways to accomplish a given task, and some are very clever - but incomprehensible to anyone but the programmer who did it that way. C programs are notoriously hard to understand, even by those who wrote them. The C compiler will compile almost anything, including nonsense, and sometimes that is hard to detect; the result is that C programs are often written swiftly and run fast when released, but the debugging takes as long as the program composition did.
The alternative is to use languages that are highly structured and much easier to understand. The logic of programs written in languages like CBASIC and COBOL is much easier to understand and follow than even the best-commented C programs. CBASIC and COBOL were early languages and were not highly structured, but later versions contained type and range checking, as well as local and global declarations. Pascal and Modula-2 carried those principles further. Programs written in structured languages required programmers to follow certain procedures and restricted their creative use of odd features in the machine instruction set. Marvin Minsky (a LISP programmer; LISP was generally outside the C vs. everything else debates) said that adopting a structured language was akin to putting on a straitjacket. Many C programmers felt the same way.
Moreover, structured language programs took a lot longer to compile than C programs. You could do several compilation and correction cycles of a large C program while a Modula-2 program was compiling. Structured language advocates said that it took longer to write programs that would compile - the compiler was very fussy - but that was the point. When you did get the program to compile it did what you expected. Unlike C, you didn't have to simulate the compiler in your head each time you wrote a complex statement. Indeed, structured compilers tended to forbid cleverly complex statements; you had to break the action down into comprehensible components. The result was far less need for extensive debugging. C programmers said, yes, and by that time you'd be too old to need the program.
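A contrived sketch of the tradeoff, with Python standing in for both camps (the function names and the task are my invention):

```python
# "Clever" style: compact, correct, but you must simulate it in your head.
def even_minus_odd_clever(xs):
    return sum(x for x in xs if not x & 1) - sum(x for x in xs if x & 1)

# Structured style: the same computation broken into comprehensible steps.
def even_minus_odd_structured(xs):
    even_total = 0
    odd_total = 0
    for x in xs:
        if x % 2 == 0:
            even_total += x
        else:
            odd_total += x
    return even_total - odd_total

data = [1, 2, 3, 4, 5]
print(even_minus_odd_clever(data), even_minus_odd_structured(data))  # -3 -3
```

Both are correct; the question the old debates turned on was which one you would rather maintain five years later.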
For a number of reasons, C won out; but I am not sure that is a good thing. I would like to see the debate reopened. It is time for the computer using community to adopt an easily learned and highly readable programming language. The real computer revolution will come about when it is possible to teach computers to do interesting things without having to learn a complex programming language.
Bob Nealy is a long time Mac user and a recent convert to the iPad, and now carries the iPhone 4 everywhere, and the iPad - and no laptop - on business trips. He was recently at a meeting of the CIOs of a number of major Silicon Valley firms, and during a lull in the meeting asked how many in the room had iPads. That turned out to be over 90%. He then asked how many carried the iPad and no laptop on trips. That turned out to be about 40%. One of the 40% was the CIO of Apple, which was no surprise.
I have other stories from readers, all so far long time Mac users, saying they are now carrying only the iPad on fairly long trips. One of my friends is an Internet security specialist stationed in Singapore who takes frequent trips all over Asia and Oceania, and he often carries only his iPad, iPhone, a Bluetooth keyboard, a stand and a pocket full of thumb drives with programs and data. He loves the lightness, thinness, long battery life, and simplicity of the iPad and iPhone, and says he can do about 90% of what he needs to do for his job using those tools alone.
His primary gripe with the iPad is that when he uses Apple's iPad version of Keynote to do a presentation via the iPad dongle hooked into a projector, Keynote apparently stops listening to Bluetooth, and so he can't use a wireless remote (he's a pretty energetic speaker who paces around and gestures a lot). He's hoping that Apple will fix that in an upcoming release of Keynote.
His other complaint is the lack of multi-tasking on the iPad; he currently solves this via making use of various utilities which allow cut-and-paste between applications via the cloud (he recommends Instapaper), and making use of his iPhone - which itself now multitasks with iOS 4 - as an auxiliary device. He's really looking forward to getting multi-tasking on the iPad when Apple's iOS 4.2 is released next month - if Apple will only get Bluetooth remote support for making presentations, he'll be a happy camper, indeed.
I am not yet up to carrying only an iPad, largely because I still use Outlook to handle most of my mail, but I understand the temptation. I have carried the iPad on all my recent trips, and it's the only computer I take out of the carry-on when I'm on an airplane. I can use it to read books - I prefer the iPad to the Kindle for reading Kindle editions of books - make notes, write scenes, and generally do about anything I'd do with a laptop on an airplane. I don't generally watch movies while on trips, but it works well for that, too.
I'm sure there will be new editions of the iPad, and competitors, and the direction will be toward the Tablet PC with handwriting recognition and stylus editing of documents. I was and remain a TabletPC enthusiast, but apparently they required more user sophistication than we had thought. If the evolutionary path is from iPad ("You already know how to use it!") that may not be a problem. We'll see.
Google Analytics (O'Reilly) by Justin Cutroni is, unsurprisingly, a book about the Google Analytics tools available free from Google. The tools allow you to analyze just how people interact with your web site. This book will show you how to do that - provided that you have some programming experience and a basic understanding of what it is you want to do. In other words, this isn't an elementary or introductory book. I'd say it is not the place to go if you've no idea of what web analytics does or why you'd be interested, but alas, I don't have a better suggestion, either. Learning about Web Analytics in general and Google Analytics in particular - what you can learn, and what you have to do to learn it - is a project with a very challenging learning curve. You can do it, and this book will be important to getting it done, but I don't think you'll find it an easy job. With that understanding, Recommended.
I wish everyone involved in medical and pharmaceutical research would read Intuitive Biostatistics: A Nonmathematical Guide to Statistical Thinking by Harvey Motulsky (Oxford University Press). That would likely save us all a great deal of money, and cut way back on nonsense promulgated as new science. You can find sample chapters and author corrections at www.intuitivebiostatistics.com.
Statistical thinking is a difficult subject. The problem is that it can look simple. Many university departments have their own statistics courses for majors. The theory is that statistics is difficult, but by concentrating on the parts relevant to one's discipline, say psychology or sociology or medicine, much time can be saved at little cost. Thus we have specialized courses that mostly consist of cookbook statistics: algorithms for calculating mean, median, standard deviation, and correlation coefficients; how to do analysis of variance (almost always called ANOVA, and usually guaranteed to find something significant in any given mass of data); and other such practices. The problem is that these courses seldom teach any understanding of the assumptions behind the statistical models the students learn to generate. There is often no mention of Bayesian analysis and its assumptions and conclusions. Mostly the students learn to look for "significance" and generally find it.
This can be costly. Some of us remember the J. B. Rhine experiments that supposedly gave statistical proof of extra-sensory perception. Students all over the country carried Rhine cards: a deck of cards each marked with a single symbol. Those included wavy lines, a star, a circle, and other such easily identifiable marks. The sender shuffled the cards, and looked at them one at a time without showing the face of the card to the receiver, who guessed what symbol was on the card. The guesses were written down, then compared to the actual order of the cards. Usually the results were about what you'd expect, but once in a while the receivers would be uncannily correct, and sometimes the same receiver would be correct several days in a row. By correct we mean the results were significantly better than chance.
Of course there were tens of thousands of such experiments. The individual experimenters were unaware of this, so when they got results that could happen by chance only once in a thousand times they grew excited, although given the number of experimenters such results were inevitable. Eventually that was understood, and experiments intended to demonstrate telepathy are now better designed. Alas, much the same thing happens with a number of medical experiments.
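The arithmetic behind that inevitability is simple. If each experiment has probability p = 1/1000 of producing a spurious "significant" result, then across n independent experiments the chance of seeing at least one is 1 - (1 - p)^n:

```python
p = 0.001  # chance of a spurious "one in a thousand" result per experiment

for n in (100, 1_000, 10_000):
    at_least_one = 1 - (1 - p) ** n
    print(f"{n:>6} experiments: P(at least one such result) = {at_least_one:.3f}")
```

With ten thousand card-guessing experiments running, "impossible" results are practically guaranteed somewhere.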
The problem is that the statistical models used in determining the "significance" level of a result involve assumptions that are often not met in the experimental design - often because the experimenter hasn't been taught much about statistical inference and its assumptions.
Intuitive Biostatistics takes a different approach. There is very little about cookbook techniques in this book; instead it concentrates on the underlying assumptions, and just why some inferences are valid and some are not. It's not easy going, but statistical inference isn't an easy subject. Alas, now that computers have made it easy to do the cookbook work of compiling all kinds of statistical numbers, many think that the subject is easy, and computers can do all the work. Intuitive Biostatistics will disabuse them of that error. The book uses many examples from pharmacology research. Recommended.
Step by Step Office Home and Student 2010 by Joyce Cox, Joan Lambert, and Curtis Frye is another of the Microsoft Press Step by Step series. Like all of the series it's a pretty good practical introduction for those who need to learn what's in Office. There are chapters on Office features including OneNote that are in the Home and Student editions of Office 2010. The "step by step" organization of the book helps those learning the programs. That tends to get in the way of using the book as a reference handbook, but the interference isn't critical. This is a pretty good first book for those who have poked around with Office but now find they're going to have to learn it in a more systematic way. Microsoft Office "just growed" from its beginnings up through Office 2003. Office 2007 attempted to integrate the parts of Office and make a more standardized control system built around "the ribbon". This was frustrating for a lot of us who had learned Office by using it, and now found that many of our favorite tricks no longer worked. Over time I have become accustomed to Word 2007 and now use it, and I will convert to Office 2010.
One definite improvement in Office 2010 is the OneNote interface, but I have reports that the interface improvement is at the expense of functionality: that some of the features that make OneNote 2007 so useful don't work properly in OneNote 2010. I'll be looking into this for a later report, but You Are Warned.
OneNote isn't the most intuitive program in the world, and it tends to be most useful in conjunction with a TabletPC. Even without a Tablet, OneNote, once mastered, is one of the best tools for doing research and collecting data I know of; at least that was my experience with OneNote 2003 and then OneNote 2007. I wish I had had OneNote or something like it when I was an undergraduate, or for that matter in High School.
Those who use Office 2010 a lot may want to look for Nancy Conner and Matthew MacDonald's O'Reilly work, Office 2010: The Missing Manual, as a better organized handbook; while those who aren't so familiar with Office will probably prefer the Step by Step introduction to get acquainted with its features. Microsoft Office is a powerful tool set that rather quietly gets better all the time.
Plain and Simple Office 2010 (Microsoft Press) by Katherine Murray is an even simpler introduction to the Microsoft Office suite; not being confined to the Home and Student edition, it includes a short chapter on Microsoft Publisher. Those planning on using Office for fancy publishing, on the web or in print, will probably prefer this book.
With all the books intended to introduce users to Microsoft Office, and in particular Microsoft Word, the astonishing part is that there is nowhere - at least nowhere I can find - a simple step by step instruction, with examples, on how to set up a very simple format for a work of fiction intended for publication. I have been using Microsoft Word since about 1990, when Q&A Write didn't work properly for Windows and at the same time Chris Peters, then the Microsoft Word czar, offered to add certain features to Word that Niven and I wanted as an inducement to get me to use it and say so in BYTE. (He did put in the features, and we did convert to Word, and I don't regret it.)
For most of my life, my work has consisted of writing and proofreading text documents, sometimes with embedded tables and illustrations, but when that is all done the work goes off electronically. Publication format and layout are done by the publishers. It hasn't been my problem, and indeed would be a work of supererogation: whatever work I might do in formatting and layout wouldn't affect how the publisher set up the work. The result is that I haven't paid much attention to how that is done.
Now I need to learn, and I haven't been able to find out despite some smart people trying to teach me by email. What I need is a simple template for a novel. The novel will consist of a number of sections, which I don't mind setting up by hand (Dramatis Personae, Table of Contents, "What has gone before"), and the main body of the text, which will consist of several parts or books, each of which is divided into chapters. Each chapter starts after a page break, and has a Chapter Number, a Chapter Title, and an epigraph. I don't insist on a format that includes an epigraph; I can add those by hand. But it does seem reasonable that there is a simple way to tell Word that the next page starts a new chapter, and have Word keep track of the chapter number while asking me for the chapter title; after which it lays that out in a preselected format and changes the page header. I'd also like a way to tell Word that a "Part" or "Book" section is coming up. The "Part" or "Book" section page consists of the Part or Book number, the title, and an epigraph, and ends with a section break that opens a new Chapter One (which is to say the chapters renumber from one after each "Book" break).
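The numbering rule I'm after can be pinned down in a few lines of Python, purely as an illustration of the behavior I want Word to automate (the titles here are invented for the example; this is not a Word macro): book numbers run straight through, while chapter numbers restart at one after every book break.

```python
# Illustration only: the book/chapter numbering rule for a novel
# laid out in "Books", each containing chapters. Book numbers run
# 1, 2, 3...; chapter numbers reset to 1 at each new Book.
def layout(books):
    """books: a list of (book_title, [chapter_title, ...]) pairs.
    Returns the outline lines the word processor should generate."""
    lines = []
    for book_no, (book_title, chapters) in enumerate(books, start=1):
        lines.append(f"Book {book_no}: {book_title}")
        for chap_no, chap_title in enumerate(chapters, start=1):
            # Chapter numbering restarts here for every Book.
            lines.append(f"  Chapter {chap_no}: {chap_title}")
    return lines

# Invented sample titles, just to show the reset behavior:
outline = layout([
    ("The Voyage Out", ["Landfall", "First Contact"]),
    ("The Long March", ["Mutiny"]),
])
print("\n".join(outline))
```

Run against the sample data, the second Book's first chapter comes out as "Chapter 1", not "Chapter 3", which is exactly the renumbering behavior described above.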
That all sounds complicated but it really isn't, and I am sure that thousands of people do this. I am also sure that if I dig hard enough into the various books I have, I will be able to figure out how to do this using the "Styles" features that have been part of Word for decades; but alas, when I do, all I find is that Word has a number of Heading styles. There are styles for lists, quotes, references, and paragraphs, but nothing obviously points to a "style" of automatically numbered chapter headings with chapter titles.
The odd part is that of the plethora of books I have for review, not one tells how to create that. In fact not one has an example of a simple document with numbered and titled chapters and automatic generation of a Table of Contents. Given that example it would be simple to set up the system I want. Since I'd suppose that most fiction uses numbered chapters with or without chapter titles, the failure of any of these books to provide a simple model is astonishing. I'm sure that it's something to be done with Headings, but alas, the word "Headings" is not prominent in any of the indices of these books. It's a puzzlement.
Since I have written at least a million words in Word you'd think I would know all about it, but when it comes to setting up a format for a new chapter page I just didn't seem to get it.
Fortunately I have many readers who do get it.
Bruce Epstein suggests that what I really need is
File-->New-->Templates-->More Templates-->Book Manuscript
There are all kinds of other templates for Newsletters, etc.
Powerpoint also has some reasonable templates for "Presentation to Persuade" etc.
And John Zukowski tells me that
This link seems to be a template that meets your needs.
You can search online for more templates here
I have other mail, and I am certain this will do the job. I continue to be amazed that I was able to search a dozen books and wasn't able to find an example or clear explanation. I suspect it's because I learned a lot of things early and never lost my bad habits. Terminologies have changed, and it's hard to unlearn something you've been doing for a decade. Newcomers won't have those problems.
There's a new iPhone 4 edition of David Pogue's iPhone: The Missing Manual (O'Reilly). This book covers all iPhones using the iOS 4 software which introduces multi-tasking to the iPhone. If you're a heavy duty iPhone enthusiast, or think you might want to be, or you are contemplating getting an iPhone 4, this book is a good investment. Pogue explains things well in clear language. The iPhone has generated competition, but it's still about the coolest pocket computer out there with its gorgeous screen and a hundred thousand applications, and I can practically guarantee you this book will tell you about features you never knew about. Recommended.
Those contemplating writing applications for the iPhone will need Pogue's book, as well as Craig Hockenberry's iPhone App Development: The Missing Manual (O'Reilly). The title is a bit of a misnomer, of course. The Missing Manual series purports to be "the book that should have been in the box"; but what's the box that iPhone App Development came in? This is more of an intermediate level introduction to writing iPhone apps for programmers who have at least a nodding acquaintance with C++. They don't have to know Objective-C, although that couldn't hurt.
If you have an idea for an iPhone App and wonder what it would take to develop and publish it, this is a good book to start with. It will tell you what tools you need and how to get them, all free except for the powerful Mac you'll need to begin with, and there's a short section on getting your Mac if you don't have one. The book then tells you how to set up that Mac with the Xcode tools that came with Snow Leopard but aren't automatically installed on a new Mac, the Apple iPhone SDK, and all the other stuff you need. The rest of the book is a systematic discovery of just what it takes to write an iPhone App. Some will be obvious to experienced programmers. Some will not. It's all clearly written with examples.
There's also a section on marketing, complete with advice on advertising strategies, which may be useful for beginners, but that's not really why you needed this book.
There are more than a quarter million iPhone Apps, some free, some expensive. Some have made their owners rich. A surprising number of app developers have made decent profits. Most don't get rich or even make wages, of course, and quitting your day job to become an iPhone App developer isn't a very good idea; but for those who can't find a decent programming day job in these economic times, learning to develop iPhone Apps may well be a useful investment of their time. Recommended.
The first book of the month is Rafael Sabatini's Captain Blood. I am working on a sea battle scene and while doing a bit of idle Googling I discovered cut scenes from the Errol Flynn movie, which sent me looking for the book, which I read long ago. I easily found a free eBook copy. Sabatini was very good at historical novels. His research was thorough, and his Bellarion the Fortunate is one of the best depictions of early Renaissance government I have ever seen (as well as a whacking good story). Captain Blood is a tale of the twilight of the age of piracy. It's a bit romanticized, of course, but that's to the good.
My second book of the month was Poul Anderson's The High Crusade, a new Baen quality paperback edition of Poul's epic story of star-faring Anglo-Norman warriors. The story holds up very well.
The computer book of the month is Pogue's iPhone Missing Manual reviewed above. It told me things about iPhones I didn't know.
The game of the month is We Rule for the iPad. I found out about this while we were recording a TWIT (This Week in Technology) broadcast, and Leo Laporte took out his iPad to find out if his Magic Cauliflower crop was harvestable. I found that absurd and said so, and after the broadcast I downloaded the free We Rule app, and got hooked. I can't say why the silly game is addictive, but clearly it is. Incidentally, Laporte as Chieftwit has the most elaborate We Rule kingdom I know of, and it must have taken a lot of time to arrange it.
And Sid Meier's newest Civilization comes out shortly. There goes more time...