Thursday, December 23, 2010

What is the DataField Name of a GridView bound to a String Array?

This is a special case that I don’t remember experiencing before but resolved with the help of a StackOverflow question Binding an ASP.NET GridView Control to a string array.

In a simple Asp.Net Membership management application, I was binding a Role list to a GridView so I could click a link to see which users were members of a selected Role. I was using the following syntax in the CodeBehind, where grvRoles is the GridView:

grvRoles.DataSource = Roles.GetAllRoles()
grvRoles.DataBind()

Note: Roles.GetAllRoles() is a static method of the Roles class in the System.Web.Security namespace; it returns a string array of Roles from the aspnetdb Membership Database.

On the Designer:

<asp:GridView ID="grvRoles" runat="server">
    <Columns>
        <asp:CommandField ShowSelectButton="True" />
    </Columns>
</asp:GridView>

Which produced the following output with the GridView's auto-generated columns. Notice the column name is “Item”, even though the datasource is a string array, which doesn’t technically have a DataField name the way a Generic List or Collection of class objects would.

I want to use a LinkButton whose CommandArgument is the Role name, which is an element of the string array. So if I convert the “Select” command button column to a TemplateField, what do I use as the DataField name? Generally I would use syntax like this:

<asp:GridView ID="GridView1" runat="server">
    <Columns>
        <asp:TemplateField ShowHeader="False">
            <ItemTemplate>
                <asp:LinkButton ID="lnkMembers" 
                    runat="server" CausesValidation="False"
                    CommandArgument='<%# Eval("Item") %>' 
                    CommandName="Select" Text="Members"></asp:LinkButton>
            </ItemTemplate>
        </asp:TemplateField>
    </Columns>
</asp:GridView>


Note the CommandArgument='<%# Eval("Item") %>', but that won’t work, because the string array doesn’t actually have an Item property. Instead you can use the longhand syntax: CommandArgument='<%# Container.DataItem %>'.
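Converted to a TemplateField, the working markup looks something like this sketch; the control names come from the snippets above, and only the binding expression changes:

```aspx
<asp:TemplateField ShowHeader="False">
    <ItemTemplate>
        <asp:LinkButton ID="lnkMembers" runat="server"
            CausesValidation="False"
            CommandArgument='<%# Container.DataItem %>'
            CommandName="Select" Text="Members" />
    </ItemTemplate>
</asp:TemplateField>
```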


If I don’t want to have the GridView auto-generate the columns, what do I use for the asp:BoundField name? Apparently there is a secret syntax by using an exclamation point <asp:BoundField DataField="!" HeaderText="Role" />.
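Putting the two tricks together, a grid with auto-generated columns turned off might look like the following sketch; it reuses the grvRoles and lnkMembers names from above, with the "!" DataField binding the string element itself:

```aspx
<asp:GridView ID="grvRoles" runat="server" AutoGenerateColumns="False">
    <Columns>
        <asp:BoundField DataField="!" HeaderText="Role" />
        <asp:TemplateField ShowHeader="False">
            <ItemTemplate>
                <asp:LinkButton ID="lnkMembers" runat="server"
                    CausesValidation="False"
                    CommandArgument='<%# Container.DataItem %>'
                    CommandName="Select" Text="Members" />
            </ItemTemplate>
        </asp:TemplateField>
    </Columns>
</asp:GridView>
```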

The breakpoint will not currently be hit. No symbols have been loaded for this document.

Doug Butscher’s blog post about the subject was able to get me debugging again in one of the Visual Studio 2010 projects in my solution. As he noted in Step 3, the Modules window indicated that the Symbols were loaded.

I stopped the debugger, deleted only the obj folder, and restarted the IDE. That was enough to fix it for me.

Tuesday, December 21, 2010

Vista Backup Stopped Working - Fix

I’ve been using Windows Vista’s built-in Backup and Restore functionality daily for over three years now, with good success until recently. I recently started experiencing a backup error every day for the incremental backups as well as when I attempt to create a new full backup.

The backup did not complete successfully. The file or directory is corrupted and unreadable (0x80070570)

This error message was not very helpful because it didn’t indicate which file or directory was corrupt. It also didn’t indicate whether the problem was on my backup source partitions (the Operating System partition or the Files partition) or on the backup destination device.

The Fix

I ran numerous utilities including the Seagate Tools (manufacturer of the source and destination hard drives) on both drives which didn’t detect any problems. The final fix was opening a Command Prompt and running “CHKDSK C: /R” where the C-drive was my primary OS disk. Because the Operating System was in use, the Check Disk utility couldn’t run until the next restart.

Backup Redundancy

The Complete PC Backup, where it creates a VHD image file of each of the backup drives, was working the entire time which provided some level of comfort while troubleshooting the traditional backup process.


Monday, December 20, 2010

Visual Studio Cloud Service for Azure Development Requires IIS7

Lately I am really wondering about the utility of using the Visual Studio built-in web server for development. Last month it was causing inconsistent rendering of AJAX modal popups, this month installing the Visual Studio Cloud Service for Windows Azure development requires IIS7.

Having the built-in web server seemed like a way to minimize the system resources and attack surface of the development workstation, but it doesn’t seem as practical anymore.

Monday, November 29, 2010

ReportViewer Woes with Tight Integration between Visual Studio and SQL Server and No Backward Compatibility

Lately I find myself creating some quick client reports (ProcessingMode=local) using the Visual Studio 2010 ReportViewer control and the Reporting Services Tablix control. The issue is that my customer is on SQL Server 2000, not 2008 as is officially supported by Visual Studio 2010 and the rdlc Reporting Build Provider. How do I know?

Mainly because of the following compile error:

“The report definition is not valid.  Details: The report definition has an invalid target namespace '' which cannot be upgraded.”   

The lack of backward compatibility was not immediately obvious, but I still wanted the ability to use the newer Tablix functionality in the VS 2010 IDE and have the out-of-the-box export functionality of the Reporting Services client reports. I’m not a fan of workarounds as I have written about before, but occasionally under special circumstances with the pros and cons heavily considered, I entertain the idea. In this case, a SQL 2008 upgrade is on the schedule in the next few months.

Workaround for Visual Studio 2010

Visual Studio 2010 cannot use any of the SQL Datasource Wizards or any other database related wizards with a pre-2005 database, which is disappointing for anyone developing “big corporate” applications as they are often late adopters. To work around this limitation in development, I backed up my SQL Server 2000 database and restored it as a SQL Server 2005 database so I could use the IDE for the Create Report wizard. Obviously, prior to deploying to production SQL 2000 servers, the database DDL and DML changes need to be scripted back out (in SQL 80 compatibility mode) and tested against SQL 2000 instances. Please don’t skip this step!

Workaround for SQL Server 2000 RDLC format

This is my least favorite part of the workaround because it can be easy to forget to undo before deployment.

The application I am working with is a website project, not a web application. The compile error occurs when I try to “Build” the project from the IDE “Build” menu. If instead, I just navigate to the web page the report is hosted in (hosting in IIS, not the IDE integrated web server), the page and its report render correctly with no errors. In order to use the IDE’s build functionality, I need to “exclude” the webpage and rdlc file it is using while working on other pages of the project.

To exclude a file from the project, right click the filename from the Solution Explorer window and click “Exclude from Project”. Excluding a file in VS 2010 renames the file to [filename].[ext].exclude so you may get a Visual Source Safe or Team Foundation prompt if your files are under source control (recommended).

When I am ready to deploy or test the pages with ReportViewer controls, it is necessary to un-“exclude” the report aspx pages and .rdlc report definitions.

photo credit: Andy Welsh / CC BY 2.0

Thursday, November 18, 2010

Five Reasons to Use a Staging Server

If you are interested in avoiding common and difficult to reverse issues when deploying new or updated software to a production environment, use a staging server – at a minimum. Your staging server should be configured as closely as possible to the production server. With virtualization, the process of cloning a server becomes a relatively trivial task.

A staging server provides a safe vehicle to “discover” and prevent commonly overlooked deployment issues:

  1. missing assembly references or resources. This is a frequent occurrence if you use third-party tools like Infragistics or Telerik. When you install the development tools, they often add the assemblies to the GAC on your development box and scripts and images to “special” virtual directories. If you blindly deploy an application to production without making sure you have the assemblies either in the GAC or referenced properly in your application, you will frequently see runtime errors and the yellow screen of death. Even items you might think are part of the .Net Framework because they are pre-loaded in your Visual Studio toolbox, such as the ReportViewer control, require the installation of a ReportViewer Redistributable executable.
  2. dependencies on physical hardware. Depending on how the application is configured, it may unwittingly rely on a writable F drive, where on the server the F drive may not exist or may be a CD-ROM drive, or it may require a 64-bit processor while the server is 32-bit. It is better to catch these issues before deploying to production.
  3. require elevated file system permissions. If your application is writing XML files or storing media, the application upload path will require Write permissions for the accessing account. Permissions requirements should be part of the installation/upgrade process, not left to post installation troubleshooting.
  4. require least privileges on database permissions. You may be using the sa account (not recommended) on your development box, but your production DBA and/or company policies won’t appreciate that in production. This is an opportunity to make sure your application is running under an account or group with the least privileges necessary to run the application, and to make sure that if your installation process is dropping/re-adding stored procedures or tables, you are also adding the appropriate grant select, insert, update, delete, execute, bulk insert, or truncate permissions to avoid unexpected SQL errors.
  5. require configuration changes to interface with other services, such as IIS version differences, SMTP, or Message Queues. On your development box, for simplicity, you might be using a local IIS SMTP service, or file system implementation to test emailing functionality, but if the production environment only allows mail to relay through a Microsoft Exchange endpoint, you have some additional configuration to do. Making sure all the appropriate accounts are understood and configured properly often takes coordination with other IT specialists. They’ll appreciate it if you coordinate it at their convenience on a staging server instead of elevating the issue to a production emergency.
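For point 4 above, the staging pass is a good place to verify that deployment scripts re-apply the grants they drop. A hypothetical T-SQL sketch (the table, procedure, and account names are made up for illustration):

```sql
-- Re-grant rights after objects are dropped and re-created during an upgrade.
-- app_user, dbo.Orders, and dbo.usp_GetOrders are hypothetical names.
GRANT SELECT, INSERT, UPDATE, DELETE ON dbo.Orders TO app_user;
GRANT EXECUTE ON dbo.usp_GetOrders TO app_user;
```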

This isn’t intended to be an all-inclusive list, but hopefully, if you are not using a staging server today, it will inspire you to consider using one in the future. Prevention is the best medicine for software implementation.

The staging process provides an opportunity to perfect the installation process for a problem-free installation. Additionally, having the installation process consolidated to a single series of installation and configuration steps in a single package with all the necessary files and scripts can simplify recreating software versions for disaster recovery purposes or for replicating to additional installation deployments, such as for different customers or at a different site.

Photo credit: Anirudh Koul / CC BY 2.0

Friday, November 5, 2010

Ajax Control Toolkit Modal Popup CSS Styling issue

I found an interesting (and aggravating) difference between running a web application in the built-in Visual Studio 2010 development server vs. IIS 7 when using the Asp.Net 3.5 Ajax Control Toolkit ModalPopupExtender control. In the built-in web server (screenshot 1), the CSS formatting did not render correctly as it did in the application hosted in IIS7 (screenshot 2). Initially I used the standard styles you will find on the toolkit sample code.

Hopefully this will help some of the many other people that had the same problem.

Click the screenshots for enlarged views. The code used follows the screenshots.

ModalPopup Visual Studio Development Server

ModalPopup hosted in IIS


<asp:ModalPopupExtender ID="pnlModalNew_ModalPopupExtender" runat="server" 
    DynamicServicePath="" Enabled="True" TargetControlID="btnNew" PopupControlID="pnlModalNew"
    BackgroundCssClass="modalBackground" DropShadow="true" 
    CancelControlID="btnCancel" />


/* Modal Popup */
.modalBackground {
    background-color: Gray;
    filter: alpha(opacity=70);
    opacity: 0.7;
}

.modalPopup {
    background-color: #FFFFDD;
    border-width: 3px;
    border-style: solid;
    border-color: Gray;
    padding: 3px;
}

The Panel pnlModalNew referenced by the ModalPopupExtender, which contains the data entry form fields, had its CssClass set to modalPopup.
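For completeness, that panel might look something like the following sketch; the pnlModalNew and btnCancel IDs match the extender markup above, and the form contents are placeholders:

```aspx
<asp:Panel ID="pnlModalNew" runat="server" CssClass="modalPopup" Style="display: none;">
    <!-- data entry form fields go here -->
    <asp:Button ID="btnCancel" runat="server" Text="Cancel" />
</asp:Panel>
```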

Tuesday, November 2, 2010

Asp.Net Mystery Membership Database

I’ve used the built-in Asp.Net Membership Provider a number of times for its simplicity and straightforward API. There are a few mystical elements that can trip you up if you don’t understand how the provisioning process works. There are two primary options:

  • manual
  • auto-magical


I generally create the membership database before I start working on the user interface (seems logical, right?). I do that by running the membership provider wizard: open a Command Prompt, navigate to the .Net 2.0 framework folder, and run the aspnet_regsql.exe command (C:\Windows\Microsoft.NET\Framework\v2.0.50727\aspnet_regsql.exe on a 32-bit OS).

Command Prompt to run aspnet_regsql.exe

That launches the setup wizard where you can choose how you want the database configured in your SQL Server instance:

  • use an existing database such as the application database or
  • a stand-alone membership database if you want to have multiple applications use the same membership database

ASP.NET SQL Server Membership database setup wizard

The wizard then runs the necessary scripts against the selected database (creating the database if it doesn’t exist) and populates it with the tables, views, and stored procedures that the API uses to manage the membership details.


In case you hadn’t guessed, this is the method enshrouded in mystery (not really, but it can seem like it). If you don’t choose to manually create your membership database (via the wizard), then the default option, as defined in the machine.config file, is to use a SQL Express database, which ends up getting created in the application’s App_Data folder in Visual Studio:

        <add name="LocalSqlServer" connectionString="data source=.\SQLEXPRESS;Integrated Security=SSPI;AttachDBFilename=|DataDirectory|aspnetdb.mdf;User Instance=true" providerName="System.Data.SqlClient"/>

Depending on your version of Visual Studio and perhaps the method you use to initiate the creation of this database, it might not be obvious exactly where the database is or how it is configured to point to that database file. The corresponding connection string doesn’t necessarily show up in the web.config file because it is inherited from the one defined in the machine.config.
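If you would rather see the connection explicitly in your project, you can re-declare it in the application’s web.config using the standard remove/re-add pattern; the connection string here is simply the inherited machine.config one shown above:

```xml
<connectionStrings>
  <remove name="LocalSqlServer" />
  <add name="LocalSqlServer"
       connectionString="data source=.\SQLEXPRESS;Integrated Security=SSPI;AttachDBFilename=|DataDirectory|aspnetdb.mdf;User Instance=true"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```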

There are multiple ways you can initiate the creation of the membership database. A couple include:

  1. Select “ASP.NET Configuration” from the Visual Studio “Project” menu (screenshot is from VS 2010).
  2. Drag a login control onto the design surface and select “Administer Website” from its context menu.

Upon first use, such as adding a user or role, a SQL Express database (ASPNETDB.MDF) will be created, including the membership provider’s tables, views, and stored procedures used to manage the membership and personalization system.


In a quick prototype project, I used the auto-magical process and added a few users. You will notice in the App_Data folder, there doesn’t appear to be an ASPNETDB.MDF, only a TeamStrength.mdf file, so where are my users stored?

ASPNETDB.MDF is not visible in Visual Studio Solution Explorer

Upon closer inspection with Windows Explorer, there actually is a membership database in that folder.

ASPNETDB.MDF is visible from Windows Explorer

Mystery solved.

You may also be interested in my introduction to implementing ASP.NET Role-Based Security presentation; the code samples from the presentation can be found through a link on that blog post.

Photo credit: m_bahareth / CC BY 2.0

Thursday, October 7, 2010

User Interface Design Patterns

User interface design is as important as good coding practices. Here is a link to 40 Helpful Resources in UI Design Patterns.

Smileycat design elements showcase provides good food for thought for a variety of common page elements.

Tuesday, October 5, 2010

Managing Hyper-V Server without the MMC

Why might someone need or want to manage Hyper-V without the Microsoft Management Console snap-in? Maybe you want to use XP, Server 2003, or a Vista/Windows 7 Home Premium or lower edition, which is not supported by the Hyper-V MMC. This post is most relevant to a Hyper-V™ Server 2008 R2 installation, not Server 2008 R2 with the Hyper-V role.

Four options* for managing the Hyper-V environment:

* I will leave it as a homework assignment for you to determine which option may work best for you – if any.

Hyper-V Poster

Slightly off topic, but Microsoft released a complex yet easy to read and informative Server 2008 R2 Hyper-V component architecture poster. A small snapshot of the poster is represented in the image below.

Wednesday, September 29, 2010

Blocked and Encrypted Javascript Files and IIS Windows Authentication

I ran into an interesting issue with an application using a couple of javascript files I downloaded from the internet to modify and use in a web application for drop-down navigation. As long as I was accessing them using http://localhost, the javascript cascading menu functionality worked fine.

Windows Authentication

As soon as I started testing using http://machinename instead of localhost, I was prompted for my PC’s username and password. No problem, I realized I hadn’t disabled Windows Authentication in IIS 7 (since I was using Anonymous Authentication). When I disabled Windows Auth, the javascript didn’t work anymore – even running on the same physical PC. I turned Windows Auth back on and the javascript worked again.


I figured there was probably some kind of trust issue, so I looked at the javascript filenames in Windows Explorer and found the names were colored green. I right-clicked to open the file properties and realized they were flagged as “Blocked” because they had been downloaded in a zip file from the internet. I’ve run across that before, so I clicked the “Unblock” button.


The files were still colored green and the javascript still didn’t work. I found that when I right-clicked a filename and clicked Properties and then “Advanced”, the files were also marked as encrypted. I unchecked the “Encrypt contents to secure data” checkbox and the javascript started working.

Problem solved.

photo credit: OhlieVher A. Arango / CC BY-ND 2.0

Monday, September 27, 2010

Project Phoenix Helps Non-Profits and Underemployed Developers

Unemployed or underemployed software developers have an opportunity to receive/earn a Visual Studio Ultimate license with MSDN, among other great tools and learning opportunities. To qualify, the developer proposes a software project for a 501(c)(3) non-profit, school, or church. See Arnie Rowland’s Like a Phoenix rising from the ashes post for full details about the program. The community benefits by receiving a service (software development) that might not otherwise be available to them.

Programs like these can only happen through the generosity of the MVPs and the numerous sponsors that have teamed up to donate their time, talents, and/or products and services to the project.

photo credit: Gabi Agu CC BY 2.0

Thursday, September 16, 2010

Top Four Uses for Remote Desktop

  1. Manage multiple PCs or servers from a remote computer or location.
  2. Keep your data on a desktop in a secured premises and access it remotely. If your laptop gets lost or stolen, the data is still safely tucked away on your desktop at the home base.
  3. Fix a friend or relative’s PC without them lugging their equipment to your house or requiring a house call.
  4. Access your laptop from another computer when the laptop display or desktop monitor stops working.

photo credit: Locutis CC BY-SA 2.0

Wednesday, September 15, 2010

Why I Almost Switched from Hyper-V to VMware vSphere

I gave Hyper-V the good ol’ college try. I had a Windows 7 instance and a Windows Server 2008R2 Web edition instance installed. A few of the limitations that I didn’t like about Hyper-V:

  • Management Console only supported on Windows 7 Ultimate or Vista Ultimate (not Home Premium or XP – pretty limiting for a home virtual environment)
  • I couldn’t allocate any more RAM in total than was physically present on the hypervisor – even if the instances were off. That doesn’t make it very easy to spin up a development or testing instance for quick testing and then spin it back down.
  • The hypervisor management console locked up causing me to hard boot both server instances and the hypervisor to get it running again
  • If I remoted or logged directly into the hypervisor (Server 2008R2 core install) itself and accidentally closed the command line window and cscript window, I couldn’t find a way to open them again without shutting down the server instances and power cycling the hypervisor. Yikes! That isn’t a very good production option.

The “Almost” in the title is because VMware vSphere and also Citrix XenServer both had compatibility problems with my hardware. Hyper-V worked right out of the box.

Friday, August 20, 2010

Hyper-V Could not initialize memory: there is not enough space on the Disk

I received this error when I tried to increase the memory of the single freshly installed VM (Server 2008 R2) on my Hyper-V 2008 R2 host from 1024 MB to 2048 MB. This is a new, clean 4 GB machine with almost a terabyte of free disk space, so I was surprised by the error. The issue comes from how memory is allocated when the VM state is saved and from how I installed Hyper-V on a bootable 8 GB USB drive. I found the clue to the answer in Robert Larson’s article on Hyper-V File Storage and Permissions.


In my setup, Hyper-V itself is running on the USB drive. The USB drive simply contains a .vhd file that holds the bootable Hyper-V image. The image itself takes up about 6.5 GB, which leaves 1.5 GB free.

The Problem

When a VM is initialized, it writes a .bin file the size of the amount of RAM allocated to the VM. The default path for that file is on the Hyper-V host drive, in my case the USB drive. I had already pointed the Virtual Hard Disk and Snapshot paths to the 1 TB SATA drive because I knew they wouldn’t fit on the USB drive (duh), but I didn’t realize that the location of the VM configuration would also need space for all the VM memory dumps.

The Fix

Change the Virtual Machine path to use the 1 TB SATA drive also.

Thursday, August 12, 2010

Case Study on Continuous Improvement in a Professional Association

I wrote this case study as an effort in process journalism so other current and potential future group leaders would have a process improvement reference. Download the Case Study on Continuous Improvement Running a Non-Profit Professional Association chapter.

You might also be interested in my popular post about 10 Reasons to Join a Professional Organization.

Wednesday, August 11, 2010

Hyper-V on a USB drive

In preparation for setting up a virtualized development environment on some new iron, I ran across this article series on Hyper-V virtualization in a flash – part 1 and part 2. I was planning on installing Hyper-V on a separate hard drive, but I think I’ll try this first. (Photo courtesy Dean Willson.)

Sadly, my virtualization needs have outgrown my previous process of using client based virtualization like Sun VirtualBox and Microsoft Virtual PC.

Wednesday, August 4, 2010

If You Cannot Load Projects in Visual Studio 2010 using IIS

If you are running Visual Studio 2010 on Vista and using IIS to host your development projects, you probably need to run Visual Studio as an Administrator for VS to have access to load the project. How do you know? You’ll probably see “unavailable” next to the project name as shown in the screenshot below.


You can fix it by right-clicking the Visual Studio launch icon in your Start menu and clicking “Run as administrator”, but that only applies to that one launch.


If instead you click “Properties” and then select “Advanced”, you can check the “Run as administrator” box, and Visual Studio will always launch with Administrator privileges when you launch VS from that Start menu link.


This is the same operation that was necessary with some older versions of Visual Studio.

Wednesday, July 28, 2010

Introduction to SSRS Report Builder

I made a short 30-minute presentation to fwPASS yesterday on the topic of Report Builder, after Mark Dalman presented an Introduction to BizTalk. The intention was to present on Report Builder 3.0 (the SQL Server 2008 R2 version), but unfortunately I had SQL Server installation issues that prevented it. I presented using SQL Server 2005 Report Builder instead, but I highlighted some of the new features found in version 3.0.

The first three slides are just tidbits of fwPASS information followed by the presentation.

A side note: other than slide 8, I created the entire presentation using Microsoft’s PowerPoint Web App on my Windows Live SkyDrive account. It is a pretty nifty web-based alternative to Google Docs.

Monday, July 26, 2010

Using Virtualization in a Development Environment

I’ve been using virtual desktop environments in the development and testing arena for years, primarily through Microsoft Virtual PC and Sun VirtualBox. However, they bring with them as many challenges as advantages. (Photo courtesy Dean Willson.)


The advantages:
  • When your PC replacement cycle is every two or three years, it can dramatically reduce the time it takes to migrate to your shiny new hardware. Just load the virtualization software and point to your virtual machine configuration file/hard drive image.
  • If you don’t use the same resource hungry applications and services every day, you are keeping your host machine operating at a higher average performance. Example: You don’t need to have SQL Server 2000, 2005, and 2008 along with all the SQL Agents, and other peripheral services consuming resources in the background if you only do database development 25% of the time. It is easier to spin up a development instance than manually turn off all those background services.
  • It can reduce the risk of data exposure due to theft or loss. For example, if I’m traveling to make a presentation, I can leave my development environment back at the office to minimize the chance that my work and data are compromised if my laptop is stolen. According to the Open Security Foundation, 21% of data loss incidents in 2009 were from stolen laptops/PCs (incidents by breach type).
  • For testing software or InstallShield builds on multiple versions of Operating Systems, it takes much less time to spin up a Virtual PC or Virtual Server than the old way of re-imaging a PC using Norton Ghost.


The disadvantages:
  • Performance of applications on the guest is rarely as good as running natively on the host operating system.
  • You need to have a license for each operating system and licensed applications that you run in a guest OS in addition to the ones you might need on the host. 
  • Backups configured in the guest only run when the guest is powered on during the scheduled backup times.
  • Options for running 64-bit guests are much more limited and often more costly than 32-bit guests. You need at least a dual-core machine, and hardware-assisted virtualization in the processor, such as Intel VT, is sometimes required.
  • Opening the saved Virtual Hard Drive image on a different machine can cause Windows OSes to require re-activation if they sense certain hardware changes. You can only activate Windows a limited number of times via the internet before you need to call Microsoft to get an alternate activation code.
  • If you have a fixed-size virtual hard drive configured, re-sizing it can present a real challenge.

This list is not comprehensive. Virtualization is a powerful technology, but it requires careful consideration when putting it into real world application. These are just a few considerations.

Friday, July 9, 2010

SSMS Tools Pack – Handy SQL Server Management Studio Utilities

I ran across the SSMS Tools Pack while reading the comments on a post about dynamically generating CRUD Stored Procedures. It has some other nice features, like saving snapshots of the execution plan to the clipboard.

It can also generate data insert statements from query resultsets, tables, or a database. That can be pretty handy for a number of reasons like:

  • populating lookup tables in the initial deploy scripts of a new application or application update where a lookup table is added
  • dropping tables and recreating test data for testing installation scripts and application testing



Thursday, July 8, 2010

Expression Design 4 Crashes Windows Explorer

I installed Microsoft Expression Studio 4 alongside Studio 2. I opened a jpg file, manipulated it by adding text, and attempted to save the .design file. Expression Design 4 crashed before the save finished. No, it’s not a beta version of Studio. Now every time I open my “Documents/Expression/Expression Design” document folder on Windows Vista SP2 with the folder set to the icons view (thumbnails) and start to scroll the window, Windows Explorer stops working and restarts. It appears to only fail on the folder that I was trying to save the .design file into when Expression Design failed. In order to see the contents of the folder, I had to change the folder view to any of the non-thumbnail options, like Details or List.

Windows Explorer Folder View Options

Tuesday, June 1, 2010

IndyTechFest 2010

I am a Windows Phone 7 programmer. At least that’s what Dave Bost, Bill Steele, and Jesse Liberty told me. That makes me feel pretty trendy.

With eight tracks of sessions I found myself wanting to attend two or three presentations simultaneously every hour. Unfortunately I had to choose just one.

Sessions I attended

A. Keynote with Jesse Liberty.

1. Windows Presentation Foundation for Developers

2. Introducing Windows Phone 7 Series

B. Lunch: Networking. It’s pretty fun to go to a regional event with over 400 attendees and realize that you know about 10% of the people there. I wanted to attend the Building a Company and Operating a Consultancy lunch and learn, but the room was packed full by the time I got my lunch.

3. Building Applications on Windows Phone 7 with Silverlight

4. Building a Data Mart 101

5. Application Development with Silverlight 4. Jesse mentioned Adam Kinney’s name in this session. That always makes me think I know a celebrity.

Thank you!

A big thanks goes out to the organizers, speakers, sponsors, volunteers, attendees, and anyone else that had their hand in the event! Thank you to my buddy Dave Fancher for giving me a ride from the Marriott North to the Marriott East.

The logistics, food, venue, prizes, etc. get better every time they hold the IndyTechFest. There were so many prizes from the sponsors it literally took an hour to raffle them all off – and they were even doing it quickly and efficiently.

Tuesday, May 4, 2010

Microsoft Word 2007 Equation Tools Hotkey

Microsoft Word Equation Designer

I was attempting to use the Alt+0153 notation to insert a Trademark symbol in a Word document, so I held down the [Alt] key and pressed the plus key (+). It didn’t insert the Trademark symbol, but it did pop up the Equation Tools designer. If you’re a geek like me and haven’t experimented with it yet, you would probably find it interesting.

Now back to figuring out how to add the Trademark symbol without having to open Character Map, find the symbol, and copy and paste it from there. Hint: Hold down the [Alt] key and type the code representing the Trademark symbol, 0153, using the numeric keypad – no plus sign involved. Copyright = 0169.

Wednesday, April 21, 2010

The Value of Peer Code Reviews

Code Review photo

Fellow .NET Users Group members Richie, President Armanda, and I facilitated a Round Table discussion on Peer Code Reviews. In the words of one of the attendees,

“That was one of the best topics and Round Table discussions we have had.”

I agree completely, although I am a little biased because I suggested the topic. The discussion covered the following:

  • reasons for code reviews (best practice, quality, compliance)
  • benefits
  • who to include (and exclude)
  • cultural prerequisites – not to be used in performance evaluations
  • pre-review prep (code standards, guidelines, checklists, individual review/notes)
  • the review itself (frequency, roles, scope, mechanics, tools, demonstration)
  • metrics
  • most common findings
  • software available – code coverage, coding-standards compliance (StyleCop- and FxCop-type software), and code review itself

In addition to improving the quality of the code under review, the group agreed almost unanimously that the greatest value is that the process makes better developers. It facilitates improvement through:

  • self motivation to improve – respect from peers
  • understanding your personal “most likely to miss” types of bugs or coding practices
  • coaching, teaching, and mentoring from other team members
  • understanding the expectations of your programming and testing responsibilities

Additional related resources:

SmartBear Software - Best Practices for Peer Code Review

Roiy Zysman's Code Review Checklist (or how to avoid the code review reaper..)

Charles Vas's Code Review Checklist

Photo attribution: / CC BY-SA 2.0

Monday, March 8, 2010

Mind your Copy to Output Directory Setting on Writeable Files

I’m used to doing more web development, so when developing WPF applications I sometimes forget that just because a writeable application file (.xml, .csv, .sdf, .txt, etc.) resides in the project root directory structure doesn’t mean it is the actual file being written to at runtime. Depending on the “Copy to Output Directory” setting, the file may or may not get copied to the bin/Debug folder, which is where the executable program resides (when debugging).

The “Copy to Output Directory” setting options are:

  • Do not copy – If you use this option, but you make changes to the copy in the project directory, the changes will not get copied to the bin/debug directory which the executable uses at runtime.
  • Copy if newer – Be careful with this if you are sloppy about making changes in one location one time and the other location the next. Which changes get overwritten at each compile can be hard to predict.
  • Copy always – Caution: If you use this, understand that if you make manual changes to the copy in the bin/debug directory, your changes will be overwritten by the version in the project directory at the next compile of the project.
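For reference, the “Copy to Output Directory” setting is stored as metadata on the file’s item in the project file, which you can edit directly. A minimal sketch of what MSBuild records (the file name here is just an example):

```xml
<!-- Inside the .csproj file. The <CopyToOutputDirectory> values map to the
     property grid options as follows:
       Never          = Do not copy
       PreserveNewest = Copy if newer
       Always         = Copy always -->
<ItemGroup>
  <None Include="LocalData.sdf">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </None>
</ItemGroup>
```

Checking this metadata in source control diffs is a quick way to see when someone has changed the copy behavior of a writeable file.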

Local Database Cache example

I created a WPF application that stores data locally on the client in a SQL CE (.sdf) database, which periodically synchronizes with a server-based SQL Server database. Since SSMS 2005/SQL Profiler does not have the ability to connect to a SQL CE database to view the data in the .sdf file, I created a quick “diagnostic” Windows Forms client to peek into the local client database and make sure the WPF program was operating as expected. I pointed the “diagnostic” Windows Forms connection string to the .sdf file in the project root directory, then started changing the data using the WPF application. The Windows Forms client was not showing the data changing, but the remotely synced server-side SQL Server was. What’s going on here?

Oh yeah, that’s right… the Windows Forms client was looking at the original client database that was created by the Visual Studio local database cache project wizard at the project root - not the live copy running out of the bin/Debug directory (see the Solution Explorer screenshot). After I figured that out, I changed the connection string on the “diagnostic” Windows Forms app to look at the .sdf file in the bin/Debug folder. Then, to avoid getting confused about whether I was looking at data in the remote database or the local client-side database, I needed to understand exactly when the .sdf file in the project root directory would be copied into the bin/Debug folder on recompile via the “Copy to Output Directory” setting on the .sdf file.
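In config terms, the fix was to point the diagnostic app’s connection string at the copy in the WPF project’s output folder rather than its project root. A hedged sketch of what that App.config entry might look like (the path, file name, and connection name are illustrative, not the actual project’s):

```xml
<!-- App.config of the "diagnostic" Windows Forms app.
     The Data Source path below is hypothetical; it points at the live .sdf
     in the WPF project's debug output folder, not the project-root copy. -->
<configuration>
  <connectionStrings>
    <add name="LocalCacheDiagnostics"
         connectionString="Data Source=C:\Projects\MyWpfApp\bin\Debug\LocalData.sdf"
         providerName="System.Data.SqlServerCe.3.5" />
  </connectionStrings>
</configuration>
```

Note the trade-off: hard-coding the bin/Debug path means the diagnostic app inspects the running copy, but the path will need updating for Release builds or a different machine.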

Monday, March 1, 2010

Edit Expression Encoder Job Output Templates in Blend

If you are like me and like Expression Encoder, you have probably already found the functionality that allows you to automatically create a web-based Silverlight player and html page wrapper to show off your fancy video editing prowess. Encoder comes with several Silverlight templates for audio only, video, and galleries of video. But what if the functionality they provide doesn’t suit your needs? Almost no problem: you can create a copy of the template and open it in Expression Blend in a few clicks, then customize the player as you see fit.

Click the Output tab in Encoder and select the template you wish to edit from the drop-down list. Click the tiny dot immediately to the right of the Template drop-down list.


You will be presented options for editing the template or a copy of it in Expression Blend or Visual Studio. Depending on the template you choose and what specifically you want to change in the template, the customization can range from relatively simple to “WOW, maybe I’m being too picky about what I don’t like about the built-in template.”

Tuesday, February 2, 2010

Data Mining with SQL Server 2005

I presented this topic to fwPASS on January 26. There was a pretty good turnout. The video was edited for brevity.

If you are interested in the topic, I highly recommend the MSDN Webcast: Applying SQL Server 2005 Data Mining to Enterprise Business Problems which I used in preparation for my presentation. The book I referenced in the presentation is Programming SQL Server 2005, Microsoft Press by Andrew J. Brust and Stephen Forte, specifically Chapter 20.

*The presentation was videotaped and the audio was recorded separately with a high-quality mic, but somehow the audio file vanished into thin air (read: accidentally deleted) the day after the event, so the audio is straight from the camcorder. Oh well. The lighting in the video was not good because the overhead lights were dimmed for better visibility of the screen.