BX24 and PowerShell for managing a build process

by Mike Linnen 18. July 2006 21:01

I have been doing some BX24 development again lately.  I have also been reading a lot about the new shell that Microsoft has pre-released called PowerShell (formerly known as Monad).  Since I have been using the same batch files and VBScript files to manage my build process for BasicX source since 2001, I thought it might be time to look at an alternative.

I need to be able to do the following:
  • Perform command line compiles of the BX24 project
  • Allow for the source to reside anywhere on the hard drive and still be able to compile.
  • Initiate a compile of all BX24 projects so I do not have to do them one at a time
  • Parse the BasicX.err file to determine if the compiler found errors
  • Launch an editor that shows the BasicX.err file only when an error exists
  • Be able to manage some registry entries specific to the BasicX IDE
  • Have a limited set of scripts that do not require any changes to support the build process
  • Allow for multiple project files to co-exist in the same folder. This means I need to save the BasicX.err file off into another file if I want to preserve the results of each compile.

After reading a bit about PowerShell it was very apparent that it would support everything I needed to do.  The main hurdle I needed to overcome was learning the syntax that revolves around PowerShell.  Fortunately it is based on the .NET Framework, so the majority of it was fairly easy to adjust to.

Since I already had a VBScript file that did most of the above tasks, I started by dissecting what it did.  The last time I touched that script was in 2001.  It handled changing the registry entries and launching the compiler, but it had no support for parsing the error file or managing many project files.  Here is the PowerShell script that I ended up with:

param ([string]$WorkingDirectory)
# Define some script variables
$chip_type = "BX24"
# Save the current directory so we can return to it
Push-Location
# If a working directory was passed in, change to it
If ($WorkingDirectory) {Set-Location $WorkingDirectory}
# Get the project files to process
$projectFiles = Get-ChildItem *.bxp
foreach ($project in $projectFiles)
{
    $project_file = $project.name.split(".")[0]
    # Use the current directory as the working directory
    $work_dir = $project.DirectoryName
    # Set some registry entries for the BasicX IDE
    $configEntry = "hkcu:\software\vb and vba Program Settings\basicx\config"
    Set-ItemProperty ($configEntry) -Name Chip_Type -value $chip_type
    Set-ItemProperty ($configEntry) -Name Work_Dir -value $work_dir
    # Determine from the registry where the BasicX executable is installed
    $program_dir = Get-ItemProperty ($configEntry) -Name Install_Directory
    # Map the P: drive to the BasicX install directory for convenience
    if (Test-Path p:) {} else {subst P: $program_dir.Install_Directory}
    # Remove the error files if they exist
    if (Test-Path basicx.err) {del basicx.err}
    if (Test-Path ($project_file + ".err")) {del ($project_file + ".err")}
    # Launch the compiler
    P:\basicx.exe $project_file /c
    # Wait for the compiler to finish
    $processToWatch = Get-Process basicx
    $processToWatch.WaitForExit()
    # Unmap the P: drive
    if (Test-Path p:) {subst P: /d}
    # Check for errors and launch the error file if any exist
    $CompileResult = get-content basicx.err
    If (($CompileResult -match "Error in module").Length -gt 0) {notepad basicx.err}
    # Copy the error file off so it does not get overwritten when multiple
    # projects are being compiled in a single directory
    copy-item basicx.err -destination ($project_file + ".err")
}
# Restore the original location
Pop-Location

Well that was pretty painless.  I basically had a script that processed all BasicX project files in a given folder.  Next I needed another script that found all the project folders under a given root folder, including sub folders, and launched the script above to do each compile.  I ended up with the following script:

# Save the current directory so we can return to it
Push-Location
Set-Location ..\
# Get a list of all projects, sorted by folder
$project_Files = Get-ChildItem -recurse -include *.bxp | sort DirectoryName
$lastDir = ""
foreach ($project in $project_Files)
{
    # Since we can have multiple projects in a folder, and we send the
    # working folder to the build script, skip folders we already processed
    if ($lastDir -ne $project.DirectoryName)
    {
        ./tools/build $project.DirectoryName
        $lastDir = $project.DirectoryName
    }
}
# Restore the original location
Pop-Location

Well that too was pretty easy.  I am beginning to really respect the power of PowerShell.  I can do so much more than I could with VBScript, and do it more easily.  Later I will put together a sample BX24 project showing how I use these scripts and the folder structure I place them in.
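
In the meantime, here is the folder layout the two scripts assume (a sketch inferred from the scripts themselves; build.ps1 and buildall.ps1 are hypothetical names for the per-folder compile script and the recursive script):

BX24Projects\
    tools\
        build.ps1        # the per-folder compile script above
        buildall.ps1     # the recursive script, run from the tools folder
    ProjectA\
        ProjectA.bxp
    ProjectB\
        Main.bxp
        Monitor.bxp

The recursive script does a Set-Location ..\ out of tools, so it picks up every .bxp under the root, and multiple project files can sit side by side in one folder.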

Scrum process for a team of 1

by Mike Linnen 12. July 2006 01:41

On my flight from Charlotte to Phoenix this week I listened to a number of podcasts.  One of my favorites is Hanselminutes.  Scott Hanselman talks about a lot of things I am interested in.

In the Line of Scrumage podcast Scott talks about applying the Scrum agile process in his workplace.  Most of what he talked about are things that we have been using at JDA Software Group Inc.  However, Scott brings up an interesting point about applying Scrum processes to a one person team.  I have often thought about this for some of my own personal projects.  I often do small projects for myself, and I end up playing the roles of product owner, business analyst, developer, and tester.  So could I effectively use Scrum as a way to manage these projects?  Well, I have attempted this on a couple of projects, and here is what I found out.

The part about building a backlog of stories that are features to get into a product, and giving them a priority, is something that has worked very well for me.  I find it easier to express the features that I wish to get into my projects as a story.  The story format lets me capture the user, action, and benefit in one quick step.  If I had to spend a lot of time working out the functional documentation of a given feature, I am afraid I would not end up completing the feature in a timely fashion.  Even prioritizing the features helps me focus on what needs to be developed next.

However, the process of story pointing features and planning them out over multiple sprints just does not seem to give me any real benefit.  The main reason is that since these are personal projects they do not get a consistent amount of time dedicated to them.  I might have 3 hours one week and none the next to dedicate to the project, so planning sprints and trying to determine my velocity is somewhat difficult.  This aspect of Scrum ends up not being part of my personal projects.  That is OK for me though, because I feel more organized just by maintaining a backlog with priorities.

Using Virtual PC

by Mike Linnen 14. June 2006 14:48

I have been using Virtual PC for various reasons around software development.  I find it very useful for maintaining old development environments.  One of the big gotchas that I have run across is copying virtuals and attempting to use them on the network at the same time.  A copied virtual assumes the same machine identity, so it ends up colliding with the original virtual on the network.  I knew you could prepare a virtual image to prompt you for a new machine name when it first comes up, but I had never looked into how it is done.  I found this article on using Virtual PC in a development environment.  It has a number of tips and tricks I have been using for a while, but more importantly it has the instructions on how to make a virtual unique on the network.  http://coolthingoftheday.blogspot.com/2006/06/using-vpc-for-development-and.html


Another Wiki

by Mike Linnen 14. June 2006 11:03
Even MSDN is using a wiki to share information.  I like the fact that you can use the tree control to quickly navigate the content.


Code Coverage of web applications in .Net 2.0

by Mike Linnen 2. May 2006 22:36

In my day job at JDA Software I have been looking at code coverage options for determining the effectiveness of our testing.  My team uses four types of tests to test the software we write.

  • Unit Tests - Tests focused on a single component of the application.  These tests are MSTests that exercise a specific software component and typically mock out any dependent components.
  • Integration Tests - Tests focused on multiple components of the application.  These tests are MSTests that exercise a component and its dependencies to ensure the components work together.
  • Manual Tests - Tests that are executed manually by our testers from a GUI interface.
  • Automated Functional Tests - Tests that are scripted in a fashion that can be repeated build after build to ensure the build is still functional.  Sometimes referred to as regression testing.

The goal of implementing a code coverage process was to determine the effectiveness of the types of tests listed above.  Visual Studio 2005 Team Edition (for testers and developers) provides some nice code coverage features we wanted to tap into.  Code coverage of unit tests and some integration tests worked fine from the Visual Studio IDE.  But for the integration, manual, and automated tests that used web services we began to run into problems: the web services that were hosted under IIS were not getting covered.  After some research I determined that code coverage under IIS was not going to work, so I started looking into alternatives.  One thing I noticed is that web service projects that were not hosted under IIS had no problem getting covered.  These projects used the development web server that comes with ASP.NET 2.0, so I decided to look into using that same development web server in place of IIS for code coverage purposes.


The generic steps for establishing code coverage for web services are as follows:

  1. Turn on instrumentation for the binaries that you want to cover
  2. Start the coverage monitor
  3. Start the development web server on a specified unused port, pointing to the folder that contains the web service.
  4. Execute your tests
  5. Stop the development web server
  6. Stop the coverage monitor
  7. Review the results in Visual Studio 2005

For the following commands use the Visual Studio Command prompt.


Turning on instrumentation of assemblies from the command line is done by the following command:

vsinstr -coverage myassembly.dll


To start the coverage monitor, use the following command:

start vsperfmon -coverage -output:mytestrun.coverage


To start the development web server use the following command:

start WebDev.WebServer /port:8080 /path:c:\mypath


To stop the development web server, simply right click the task bar icon for the web server and select Stop.


To shut down the coverage monitor, use the following command:

vsperfcmd -shutdown
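
Putting the individual commands together, here is a batch file sketch of the whole sequence.  The assembly, output, and path names are the same placeholders used above, and the tests themselves still run in the middle:

rem Instrument the binaries, then start the monitor and the web server
vsinstr -coverage myassembly.dll
start vsperfmon -coverage -output:mytestrun.coverage
start WebDev.WebServer /port:8080 /path:c:\mypath
rem ... execute the unit, integration, manual, or automated tests here ...
rem Stop the web server from its task bar icon, then shut down the monitor
vsperfcmd -shutdown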


We were now able to do code coverage of all the types of tests that we planned on implementing.  Some of our test runs are going to be executed by the build and some will be executed by the testers.  Since VS.Net 2005 supports merging code coverage results, it should not be a problem to combine all runs into a single report that shows how effective our tests really are.


Some references to articles that helped me come to this conclusion:

Command Line code coverage:

http://blogs.msdn.com/ms_joc/articles/406608.aspx


Using WebDev.WebServer from the command line:

http://www.devsource.com/article2/0,1895,1886246,00.asp

Putting XP on a USB key

by Mike Linnen 3. March 2006 00:36

I found this today http://www.informationweek.com/shared/printableArticle.jhtml?articleID=177102101

I have not tried it myself but if I get a chance I might give it a whirl.


More updates on copy podcasts program

by Mike Linnen 2. March 2006 00:16
Well I have been using my Copy Podcast to memory card utility for over 6 months now.  Even though it is only a command line program, I have enjoyed using it because it makes moving the podcasts I like to listen to onto my Pocket PC a lot easier.  The only thing I wish I had is a feature to rank some podcasts higher than others, so that they get onto my player faster than the podcasts I rank lower.  I think I will go ahead and migrate the program to .Net 2.0 and add in this feature.

VS.Net 2005 and NUnit

by Mike Linnen 17. November 2005 21:50

Well I have been looking at VS.Net 2005 some since it was released.  I wanted to try out some of the new features.  I am pretty attached to using NUnit, so once I got a little bit of code going in VS.Net 2005 I decided it was time to try NUnit.  Fortunately there is a new iteration release of NUnit (2.2.3) that works with VS.Net 2005.

So I downloaded it and wrote my first test like I always do: create a test project, add a reference to NUnit, add a new class, put a TestFixture attribute on the class, and add a public method that returns void and has the Test attribute.  I then fired up the NUnit GUI and ran the test.  However, the test I wrote did not show up in the GUI.  I fiddled around with the test code for a while, and I even downloaded the NUnit source code to try to figure out why my test was not seen.  After about 30 minutes of messing around, I realized that when you add a new class to a VS.Net 2005 project it looks like the following:

using System;
using System.Collections.Generic;
using System.Text;

namespace ProtoSystem.Scrum.Business.Tests
{
    class Class1
    {
    }
}
I never noticed that public does not appear before the class keyword, so the test class could not be seen by NUnit.
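
For reference, here is the shape NUnit does pick up, a minimal sketch against the NUnit 2.2 API with the class declared public (the test name is hypothetical):

using System;
using NUnit.Framework;

namespace ProtoSystem.Scrum.Business.Tests
{
    [TestFixture]
    public class Class1
    {
        [Test]
        public void MyFirstTest()
        {
            Assert.IsTrue(true);
        }
    }
}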

Copy podcast program update

by Mike Linnen 30. June 2005 20:26
Noticed another little problem with my copy podcast program: I need to do some clean up of the directories on the destination (SD card).  After I delete podcasts on my Pocket PC, many empty folders start to accumulate, so I need to add a process to clean up empty folders under the destination folder.
So far I have been using this program for almost 2 weeks.  I like this setup a lot better than just relying on Doppler and Microsoft Media Player to manage the synchronization process.  All I need to do is make the copy program a little more friendly and add a few abilities to the Media Player to support bookmarks and auto delete.
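
Something like the following C# sketch is what I have in mind for the cleanup step (RemoveEmptyFolders is a hypothetical helper, and the destination root would be the SD card path):

using System.IO;

class DestinationCleaner
{
    // Depth-first, so folders left empty after their children
    // are removed get removed as well
    public static void RemoveEmptyFolders(string path)
    {
        foreach (string child in Directory.GetDirectories(path))
            RemoveEmptyFolders(child);
        if (Directory.GetFiles(path).Length == 0 &&
            Directory.GetDirectories(path).Length == 0)
            Directory.Delete(path);
    }
}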

Adjustments to my podcast copy program

by Mike Linnen 22. June 2005 14:45
In my program that copies podcasts over to my SD card, I noticed a few things that need to be changed.
The way I am using this, the copy process is really a move: I am moving the file to the SD card from its original place on the local PC hard drive.  This means the file should retain its original date, and I do not think my program is doing that.
Files should also be moved in chronological order.  This helps ensure the oldest files get copied first in the event there is not enough room on the destination.
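
Here is a rough C# sketch of both changes (a .Net 2.0 style sketch with hypothetical directory paths; as far as I know File.Move preserves the file's original last-write date, even across volumes):

using System;
using System.IO;

class PodcastMover
{
    static void MoveOldestFirst(string sourceDir, string destDir)
    {
        string[] files = Directory.GetFiles(sourceDir);
        // Oldest first, so the oldest podcasts win if the card fills up
        Array.Sort(files, delegate(string a, string b) {
            return File.GetLastWriteTime(a).CompareTo(File.GetLastWriteTime(b));
        });
        foreach (string file in files)
            File.Move(file, Path.Combine(destDir, Path.GetFileName(file)));
    }
}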

