I Love MVP Summits aka Excel Services is Useful!

I love the MVP summit. Despite the fact that many of my friends couldn't make it, and that we had two throwaway (general Office) sessions this morning, the Excel content today was VERY cool. VERY VERY COOL!

We decided to make our first session a trip to the Microsoft Company Store, then followed it up with an impromptu Excel session of our own. Roger Govier, Jon Peltier, Mike Rosenblum, Charles Williams and I sat around discussing issues with the charting engine in Excel. A lot of fun, and we learned some cool things from each other too. Very cool stuff.

And after lunch, we had our standard meet the Excel team session, which is always entertaining. I can't disclose the content, but the back and forth banter with the Excel team as we discussed issues, wants, needs and such was good. I really enjoy the banter with the Excel product team, as I do believe that they really want to see the product be the best it can be.

After that, we had a session on Excel Services. I figured that this would be better than the morning sessions, but honestly wasn't really expecting to get much out of it. Excel Services is a SharePoint thing, which we can't afford, and it has always left me feeling like yelling out "To the Cloud!" with disdain. It's always been totally out of reach for those of us who don't have SharePoint, and therefore useless to me.

Wow… things changed for me in that regard today!

Again, I can’t discuss some of what was in the session, but there is some stuff I can talk about, as it’s already live.

Amy (who led the session) wanted to demonstrate to us how useful Excel Services and the Excel WebApp are. So she spent some time trolling the MVP websites to find an example of a site with a good page to demonstrate what she wanted to show… and she picked one of my articles. I have to say that was a little humbling, and it felt kind of awkward as she showed my page off to the collection of my peers sitting around me. The page she picked was my "Five Very Useful Functions For Working With Text", which has also just been the source of an article for CMA BC's Update Magazine.

Amy mocked up a copy of my website page to demonstrate injecting the Excel WebApp into my site in order to give the user an interactive experience. Rather than looking at pictures of spreadsheets and having to download the example file, she used the WebApp components, stored on Windows Live SkyDrive, to let users actually type in the formulas and see them work right in the webpage, like this:


“Okay”, I’m thinking, “this is neat, and I can see the use, but it’s going to be a pain in the backside to get it to work.” So then she kicked us into lab mode and gave us an hour to try out the webparts.

After a bit of futzing around, learning how the webparts interact with a worksheet versus a named range, and how hidden rows behave, I implemented the parts on my site. Including that play time, I had the article converted to be interactive in less than an hour. I’m floored. The article is now live on my site, and you can fool around with it!

This is very cool. The workbooks I used reside in a public folder on my SkyDrive, so you can open them and play with them in the Excel WebApp (click the icon on the bottom right of the control). You can't save changes back to my SkyDrive due to the sharing permissions, but you can save your own copies from SkyDrive.

For an article that tries to teach, like this one does, this is perfect. I'm definitely going to be making much more use of this feature, and will have to put up a blog post/article on how to do it. :)

Mapping a Network Drive to a Local Folder

Tonight I had to do some work on my financial model from home. Naturally, I copied the files into my "work stuff" folder on my laptop hard drive, and headed out at the end of the day. But as I disclosed in my last post, the file now has (shudder) external links in it. So in order to work effectively on the model, I needed to replicate the file paths.

If I had a network at home, this would be easy, as I could map a network drive to S:\, upload the files there, and I'd be good to go. Actually, I do have a network at home, but that means I'd have to upload the files to my server, work on them, then download them to take back to work tomorrow. And I'm way too lazy to do all that. So instead I spent some time working on a better alternative. :)

I now have a little VB script which maps the S:\ drive to my "C:\My Documents\Work Stuff" folder. I run the script, and Excel doesn't have a clue that it isn't working with the S:\ drive at work. No repointing of links required, no futzing around uploading files to my server.

Here's the script (I actually replicate the H:\ drive too), which makes use of the DOS SUBST (or Substitute) command:

    Dim objShell, strDrive1, strDrive2, strPath
    Set objShell = CreateObject("WScript.Shell")
    'Drive letters to map, and the local folder to point them at
    strDrive1 = "H:"
    strDrive2 = "S:"
    strPath = """C:\My Documents\Work Stuff"""
    'Map each drive letter to the local folder via SUBST
    objShell.Run "cmd /c SUBST " & strDrive1 & " " & strPath, 1, True
    objShell.Run "cmd /c SUBST " & strDrive2 & " " & strPath, 1, True

The trick is that I have to remove these mappings using the SUBST command before returning to work, or my regular network drive mappings won’t fire since SUBST is persistent. The script to do that is pretty simple as well:

    Dim objShell
    Set objShell = CreateObject("WScript.Shell")
    'Ignore errors in case a mapping doesn't exist
    On Error Resume Next
    'Remove the substituted drives
    objShell.Run "cmd /c SUBST H: /d"
    objShell.Run "cmd /c SUBST S: /d"

To make it work, just open a text file, drop in the appropriate section of code above, and save the file with a .vbs extension. Double-click it and the paths should map.

NOTE: The triple quotes on strPath in the first routine are there to feed a single set of quotes through to cmd, which is necessary to allow spaces in the file path.
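Once those outer quotes collapse, the command that cmd actually receives looks like this:

    SUBST S: "C:\My Documents\Work Stuff"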

Hopefully you’ll never find this useful, as that means you’re taking your work home!

Question on your modeling practices…

So I've been working on a massive model over the past year, and recently recorded a webcast on modeling practices. In the course I cover some of my philosophies for making sure that the model encourages valid data entry; namely, I colour all input cells green and tell my users that green means go. This is a practice that I've followed for many years, and it's served us very well at work.

When working on my model though, just colour coding the input cells isn't enough. I need to enforce different looks for different cell types, and for this I reached for Styles to help me manage consistency across the workbook. My custom styles gallery looks like this:

All the DE_ styles are data entry cells. The Head_ styles are for headers, and the SYS_ styles are for formulas and text that I don’t want the users to change.
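If you want to set something similar up yourself, styles like these can be created through Home > Cell Styles > New Cell Style, or in VBA. Here's a minimal sketch of what defining one of my data entry styles might look like; the style name and the exact shade of green are illustrative, not my actual settings:

    Sub CreateDataEntryStyle()
        Dim st As Style
        'Remove any existing copy so the routine can be re-run safely
        On Error Resume Next
        ActiveWorkbook.Styles("DE_Input").Delete
        On Error GoTo 0
        'Add the style and set its formats
        Set st = ActiveWorkbook.Styles.Add("DE_Input")
        With st
            .IncludeNumber = False                'leave number formats to the cell
            .Interior.Pattern = xlSolid
            .Interior.Color = RGB(198, 239, 206)  'light green fill: "green means go"
            .Font.Color = RGB(0, 97, 0)
            .Locked = False                       'input cells stay unlocked under sheet protection
        End With
        'Apply it with: Range("B5").Style = "DE_Input"
    End Sub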

For a long time this worked out really well for me, but then I had to implement a change… I had to link to data in an external file. I try to avoid that, but I need to be able to see this data in my model, and I didn't have a style for it. Excel 2010's gallery has some default styles, shown here:

As an accountant, I just can’t have a block of Linked Cells with double underlines all the way through. Double underlines are for totals, not accents. So I ended up building my own style for this too:

I think it’s equally ugly, but it does draw your attention to the fact that they’re pretty dangerous.

I’m just curious as to what conventions you use. Do you highlight links in any way? Do you draw attention to internal links (to other sheets) or only external? What other things do you try to draw attention to?

Working with Styles

If you've followed my twitter feed over the past few months you've seen that I've been working on a couple of pretty large financial models. These models give us the ability to change a large variety of inputs in order to predict our real estate development over the next 25 years (through the build-out of our remaining lands), as well as forecast the effect of the real estate sales on our operating divisions (golf course, marina and fitness centre). To give you an idea of their scope, they consume over 170 pages of 11×17 paper when printed.

This project was actually pretty neat in that I was able to sit down with people, scope it out in full, and apply every best practice as I was building. I'm really proud of these models, and even based my recent course for CGA on the techniques I used in building them.

In the end, I built two parts of the finished model, and inherited a third piece that had been built by someone else.

One of the techniques I used was to apply Styles to the model. (If you're not familiar with Styles, then have a quick read of this article on my site.) As it turns out, that was a very smart move. Earlier this week I got the call that my headers needed to be changed, and I was provided with an RGB colour scheme. In the model components that I built, it took less than five minutes to update the Styles to the new colour scheme, and the whole model was updated.
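The update itself is nothing fancy. Whether you do it through Home > Cell Styles > (right click) > Modify, or in VBA, you change the style definition once and every cell that uses it follows along. A rough sketch, using a made-up style name and colour rather than my actual scheme:

    Sub RecolourHeaderStyle()
        'Change the fill on the header style in one place;
        'every cell formatted with that style updates automatically
        With ActiveWorkbook.Styles("Head_Main").Interior
            .Pattern = xlSolid
            .Color = RGB(31, 73, 125)    'the new corporate RGB value
        End With
    End Sub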

In the other model, the one I inherited, well… I’m still working on updating all the pieces. This time, however, I’m actually updating it with styles so that I’m not caught again if there is another change.

Updates to PDFCreator Articles

It’s way long overdue, but I FINALLY updated some of the PDFCreator articles on my website. The updates include:

  • Better handling to shut down PDFCreator before the code starts running
  • Better methods to tell when the PDF has been created
  • Better closing methods that shut down PDFCreator completely after running

In my experience, these modifications have led to much more stable routines that run as expected, or at least let you know if they haven't. (I VERY rarely see a failure now.)
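To give a flavour of the first point above, the general idea is to make sure no orphaned copy of PDFCreator is hanging around before your routine starts. This isn't necessarily the exact code from the articles, but a minimal sketch of one way to do it, asking Windows (via WMI) to terminate any running PDFCreator.exe process:

    Sub ShutDownPDFCreator()
        'Find any running PDFCreator.exe processes via WMI and terminate them
        Dim objWMI As Object
        Dim objProcess As Object
        Set objWMI = GetObject("winmgmts:\\.\root\cimv2")
        For Each objProcess In objWMI.ExecQuery( _
            "SELECT * FROM Win32_Process WHERE Name = 'PDFCreator.exe'")
            objProcess.Terminate
        Next objProcess
    End Sub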

In addition, I also added a routine that allows printing specific worksheets in a file to a single PDF.

You can find the updates here: http://www.excelguru.ca/node/21

Recording Excel Videos

So this past Friday, I had the opportunity to go and shoot a couple of Excel webcasts that I’m doing for the Certified General Accountants. This was a pretty interesting experience, as I’m very used to doing live presentations, but I’ve never been recorded for later broadcast. The opportunity to work with a professional crew was pretty neat too, and I thought I’d share some of my experiences.

So the summary is this: Filming is like a day at the spa. It starts with a facial, and ends up with you getting your chest waxed.

The company that CGA hired to do the shoot was Blink Media Works in Vancouver. I can't say enough about these guys. They were absolutely pro at helping me learn the methods they use to record video, and I've come to the conclusion that being a producer pretty much means that you're half technical visionary and half motivational speaker. Arthur, our producer, did a great job at both pieces.

I was a little surprised at how many people we had involved in this. In addition to myself and the host, Blink had four people with us for the whole day:

  • Producer
  • Camera technician
  • Teleprompter technician
  • Makeup artist

I wasn’t surprised by the first three, but the makeup artist really surprised me. I’m not sure why, but I was expecting that you’d sort of be “done up” and that would be it. Not so.

As I mentioned above, it was like starting with a facial. Justin (my host from CGA) got to go first, then me. Cleansers, moisturizers, some base stuff to make your skin look good for the camera, and some powder to stop my forehead from shining. And apparently she didn’t really need to put much on me at all. After that, I got to watch the filming process at work as they recorded the introduction and outro for my webcasts.

And then it was my turn. I stood in front of the camera and they did a final check to make sure I looked PERFECT on screen. I can honestly say that I've never had anybody spend so much time getting my hair to be absolutely right. And I'm not kidding here: they stopped the shoot to get one (yes, one) of my hairs to go where it was supposed to. There is some serious pride in their work there!

After the makeup came the sound check. The camera technician hooked me up with a microphone, and then there was a lot of work to make sure it was working well. They didn't want it visible on camera, so it ended up taped to me under my shirt; then we had to make sure the shirt wasn't rubbing on it, or the sound went scratchy. And that's enough to require re-shooting the scene.

And then we started with the actual shoot. Breathe, loosen up, smile, be excited… this is where Arthur's passion and motivational side came through. He is just like the classic director you see in behind-the-scenes footage on a set. "Okay, we're starting from here. Loosen up, deep breath, and when you're ready… take it away KEN!"

I've got to hand him some real kudos here too… I'm a pretty skeptical and practical guy overall. I don't do relaxation well, but Arthur worked really hard and was really patient with me as he was getting me to "connect" with the camera. With people it's a lot easier: you can see the social cues, read the tone and see the questions in their eyes. Here you have a lot more room for self-doubt and self-consciousness. It took a while (far longer than I would have liked), but eventually I hit my groove and we were able to record 5-10 minute segments without hearing "CUT!"

A couple of quick observations about film:

  • They get you to smile… to the point where it feels campy and ridiculous. The whole time they tell you that it will come across as natural on the screen. While I haven’t seen my own film yet, I watched Justin’s and would have to agree that this is the case. He mentioned that he felt like he was overdoing it, but he came across really natural on screen.
  • The producer sits behind, but just to the right of the camera. Letting your eyes flit to him for even a split second is enough to blemish the recording though. It is noticeable on film.

I can say that the hardest thing about this shoot for me was working with the teleprompter. To be clear, this is in no way due to the technician, but rather due to my understanding (or lack thereof) of how teleprompters work. I prepared my script as I usually would… I kind of figured that the teleprompter would be similar to the binder I usually have: a full 8 ½ x 11 sheet of paper. Boy was I wrong!

Here are the difficulties I ran into:

  • Take your full size sheet of paper that has your script outline on it and put a pad of Post-It notes on it. That’s how much of your note page you really see. It left me feeling constrained and boxed in… claustrophobic. I couldn’t see what was coming any more.
  • In my presentations, I build my slides with the intention of talking around the points and fleshing them out more. It’s a real battle not to just read the teleprompter though! And with losing the peripheral vision based on the above, it makes it hard to ad-lib.
  • The teleprompter had no formatting at all. No bold, no colour, no underlines, nothing. It’s just text that is all in the same size. So where my slide headers in my script were Bold and in another column, they were now in the teleprompter as normal text, the same size as everything else. This is really hard to follow, as I lost track of which were slide titles, slide points and notes that I wanted to talk about. Part way through we stopped and put in a line of asterisks before each slide point, just so that I could recognize where I was. This made it a LOT easier to transition between points and keep the flow going.
  • Later, when I was getting more comfortable, I did start to expand on my points and go a bit off script. This was great, as it came out very natural, but it had issues as well. A couple of times I read a point from the teleprompter, expanded on it, then read the next point on the prompter. At that point the technician moved the prompter up and I realized that the next point was the one I had just expanded on… doh! And being that you’re on camera, there is no way to signal the tech to move the prompter up without it being caught on film. Again, an issue with having such a small window into the presentation.

Despite these issues, we got into a real groove later in the day and things sailed along pretty smoothly. I can certainly say that I’ll do a lot more work on my scripts next time. Things I’m thinking:

  • More bullet points and fewer sentences in the script
  • Shorter bullet points so they fit on one line (to get more of them on the screen)
  • I'll put in my own asterisks or come up with another way to indicate slide titles or points. (Maybe ST-Slide Title Text and SP-Slide Point Text.)

The next thing that was kind of weird was that we shot out of order. I totally expected this, but it was still odd.

In the morning we shot all the video that accompanies my PowerPoint slides. I talked through all the points, and the editors will cut the slides into the video stream where I talk about them. The challenge was that the two webcasts had a bit of overlap in them. It was tough to remember what I'd said after two or three cuts, let alone when doing an overlapping slide from the second webcast while on my second cut there.

In the afternoon we did the computer portion. We hooked up Camtasia Studio (thanks TechSmith!) to record both the audio and video of my Excel work. In addition, the camera kept rolling to record video and audio. The editors will use the audio and some of the video footage from their shoot, and cut in the Excel video recorded with Camtasia. The only reason we recorded audio on the Camtasia side was so that the editing department knows how to line the better quality audio up with the Excel portion.

And then, almost as quick as it all started, it was done. We ended off the day and it was time to go… at least, once the microphone was taken off.

The camera technician was finishing up labeling film I think, so I decided to take care of that bit myself. I reached into my shirt, grabbed the tape he’d used to secure the mic, took a couple of deep breaths and RRRRIIIIIPPPPPP!

I heard the Teleprompter technician say “oh my God!” And there I was, staring through watering eyes at the massive patch of fur that was stuck to the tape… and not on my chest where it belonged. Yup… just like a day in the spa that ends up with you getting your chest waxed!

All in all (except for the last part) it was a great day, a lot of fun, and I can’t wait to see the finished product. My number one recommendation to anyone that is going to do this though? Wear an undershirt.

What I’ve been up to lately

My blog posts have been a little scarce (understatement) over the past couple of weeks. This is because I’ve been working pretty hard on a couple of webcasts that I’m going to be shooting tomorrow in Vancouver for the Certified General Accountants.

We're doing two webcasts on Financial Modeling. The first will be on introducing dynamic components to the model, and the second focuses on techniques to make the model stable and durable as it is passed off to other users.

I’ve done many live Excel courses, and love doing them, as I get to see the students and interact with them one on one. This is going to be a new experience for me though, as this is a pre-recorded session with live Q&A afterwards. The company producing the webcasts has been great to work with so far, and after our conference call yesterday I can’t wait to see how these are going to turn out. This should be very cool.

For anyone interested, the landing pages for the webcasts are here:

Running Office 32bit with 64bit

I recently migrated to the 64bit edition of Microsoft Office 2010. My main reason for doing this was that I wanted to make use of the RAM in my machine with PowerPivot, but it certainly didn't come without a bit of pain. The most notable part was that I had to convert a bunch of 32bit API calls in my VBA to 64bit compliant calls while preserving 32bit compatibility for the other machines in my office. Despite my initial intimidation, however, some good friends helped me out, I was up and running within a few hours, and I now believe I understand how to migrate the rest of my code easily. (I may post on that in the near future.)
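Until I do, the general pattern (a minimal sketch using the FindWindow API as an example, not the actual declarations from my model) is to wrap the declarations in conditional compilation so the same code compiles under Office x64 and the 32bit machines at the office:

    #If VBA7 Then
        'Office 2010 and later: PtrSafe is required, and window handles become LongPtr
        Private Declare PtrSafe Function FindWindow Lib "user32" Alias "FindWindowA" _
            (ByVal lpClassName As String, ByVal lpWindowName As String) As LongPtr
    #Else
        'Older 32bit versions of Office keep the original declaration
        Private Declare Function FindWindow Lib "user32" Alias "FindWindowA" _
            (ByVal lpClassName As String, ByVal lpWindowName As String) As Long
    #End If

    Sub GetExcelWindowHandle()
        #If VBA7 Then
            Dim hWnd As LongPtr
        #Else
            Dim hWnd As Long
        #End If
        'Find the main Excel window by its class name and caption
        hWnd = FindWindow("XLMAIN", Application.Caption)
        Debug.Print hWnd
    End Sub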

One challenge I did face though was that you cannot run Office x64 on the same machine as Office x86 (32 bit). To install Office x64 you must uninstall any 32bit version of Office programs first. This means Visio, Project, Office 2007, Office 2003, etc…

My challenge with this is that I teach courses and like (need) to have multiple versions of Office installed so that I can teach in the appropriate versions. I could use a full blown virtual machine, but the problem here is that I find it inconvenient when trying to teach. I can’t flip back and forth between the app and my presentation easily, my presenter mouse doesn’t work in the VM… it just doesn’t work.

So when I installed Office x64, I was a little disappointed. I really wanted to run it side by side with Office 2003 and 2007, but I couldn’t.

But then, in a discussion with one of my staff yesterday, we ended up chatting about Windows XP mode in Windows 7. It IS a virtual machine, but it allows you to run an app installed in the VM as a program from the host (Windows 7) desktop. Here’s a shot out of my start menu of the applications I installed in the Windows XP Mode VM:

So this is pretty cool. I’ve been able to have Excel 2010 x64 open and running on my laptop, and I was able to successfully launch Excel 2003, Excel 2007 and Excel 2010 (32 bit) all together.

The Windows XP Mode apps all run on the virtual machine, so they are segregated from my host operating system, which makes this possible. I also installed SmartIndenter and MZTools in the VM, and those both show up in the VBE for those apps when I launch them.

I'm really impressed with this. Granted, it's far from perfect application-level virtualization, but it allows me to do what I need. All the issues I mentioned above are solved with this. I can run all the versions of the app together, and they seem to run seamlessly.

If you are running Windows 7 and want to check this feature out, here’s the link: http://www.microsoft.com/windows/virtual-pc/download.aspx