Thoughts from Dan Miser
# Saturday, 27 April 2013
I was using TestFlight for my existing Xamarin MonoTouch project, but it didn't really do what I wanted. I wanted something that would centralize crash reporting for me when something went wrong with my app after it was downloaded from the App Store. TestFlight never really worked for that (at least for me). It allowed me to send beta versions of my app, which was cool, but I wanted a bit more.

I downloaded the binary version of HockeyApp and set up an account and an app on the HockeyApp site. I got it working by:

  1. Add the reference to the pre-compiled HockeyApp.dll
  2. Place the bundle in the root of my project directory (a bundle is really just a directory of files). The instructions on that site no longer work for simply choosing Include in Project, so I used the "Add Files from Folder" option instead
  3. Add the code to make things crash proof
  4. Fix a build error when compiling to device (error MT5202: Native linking failed. Please review the build log.), which was really due to Undefined symbols for architecture armv7: "_CTFontCreateCopyWithAttributes". I fixed it by adding the following to my Additional mtouch arguments: -gcc_flags "-framework CoreGraphics -framework CoreText -framework QuartzCore"
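For step 3, the crash-reporting setup amounts to a few lines at app launch. This is a hedged sketch only: the exact names on the precompiled HockeyApp.dll binding may differ between versions, and YOUR_APP_ID is a placeholder for the App ID from the HockeyApp portal — verify both against the binding you downloaded.

```csharp
// Sketch: assumes the binding mirrors the native iOS SDK's BITHockeyManager.
// Check the class/method names against your version of HockeyApp.dll.
public override bool FinishedLaunching (UIApplication app, NSDictionary options)
{
    var manager = BITHockeyManager.SharedHockeyManager;
    manager.Configure ("YOUR_APP_ID"); // placeholder App ID from the HockeyApp portal
    manager.StartManager ();           // installs the crash handler

    // ...the rest of the usual startup code...
    return true;
}
```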

After all of that, I was able to get things compiled. I then followed the deployment suggestions from the page on the github project.

I was able to upload the dSYM and .ipa files, got the email invite, and tried to download the app. I initially received an "Unable to install the app at this time" error message. That led me to believe the device wasn't registered, so I tried to import it using the HockeyApp bookmarklet. It brought a list of existing UDIDs back to the HockeyApp site, but the import did not save those devices. I need to drill down on this some more to get a solid list of steps needed to release both beta and App Store versions.

I'd also like to have logging data stream to the HockeyApp server and be correlated to a specific device. I'm not 100% certain I've done everything properly, but this was a good cookbook of what I did to at least get things uploaded to the HockeyApp server. If I can get this to work, it's well worth the $10 per month.

Saturday, 27 April 2013 03:56:09 (GMT Daylight Time, UTC+01:00)  #    Comments [0] -
# Monday, 26 November 2012
I wanted to breathe some new life into my Bootcamp running Windows 7 32-bit on my MacBook Pro. I figured the easiest way to do this was to move the entire Bootcamp partition over to a brand new Samsung 840 Pro 128 GB SSD (MZ-7PD128BW). It looks like the best option for getting a second hard drive into a MacBook is the MCE Optibay.

Unfortunately, it's not as easy as throwing in the second drive, moving the partition over, and being done. For starters, the Bootcamp Assistant does a little bit more than simply add a Windows partition for you, so creating my own partition didn't work. Bootcamp Assistant did create a partition for me on my second drive, but I then had a partition named "BOOTCAMP" on both drives. I then decided the machine was ready for Windows 8 64-bit, but unfortunately, I was greeted with a black screen and blinking cursor when trying to boot off the external USB drive to finish the Windows install.

After all of that pain (and various utilities to clean up the bad things I did in the last paragraph, like /sbin/fsck -fy), and some failed attempts to use VMware Fusion to restore the prior BOOTCAMP partition, I found a forum thread that finally got things working. I'm reposting a portion of the brilliant post by richlee111. Granted, it was a lot of opening and closing of the MacBook, and I had to deal with a stripped screw on one occasion, but everything worked beautifully thanks to his advice:

So if you want to run 2 HDDs from your Macbook, with one being for boot camp, the steps below worked for me: 

- Take out the MCE optibay and put back the superdrive into its original location. 

- Install the drive that you want to install boot camp into the original HDD drive bay. 

- Stick the original OSX install disk into the superdrive and first install Mac OSX onto it. 
Realize that you are only doing this to run the boot camp install and will be wiping it out later.

- After you have installed OSX, go through the initial setup and be at the desktop. Run the 
boot camp assistant and go through with the install and have it create a partition for boot camp.
At this point, it doesn't really matter how big/small the partition is for Windows. You can adjust
and resize the partition during the Windows install process for choosing the location and partition.

- Go through finishing the boot camp assistant in OSX, stick your Windows install CD into the drive 
and boot into it. This time it should work. 

- Once you have completed the Windows installation and you are at the Windows desktop, stick the 
Mac OSX cd back into the drive and run the setup.exe. This will install all the drivers that will 
make it recognize all the Mac hardware, etc.

- Finally, take out the CD drive, swap back in the optibay, put your boot camp HDD in there, and 
put back the HDD with your Mac OS. 

Other tidbits of trivia and lessons learned during this process:
  • To get the iSight camera working with Windows 8 inside VMware Fusion, select the Virtual Machine | USB & Bluetooth | Connect Apple FaceTime HD Camera menu item.
  • Creating a dmg backup of your Bootcamp partition is not recommended. There is no way that I found to restore the dmg to the new partition (yes, I tried the dd command, but it did not end well).
  • This is a good resource page to show how to deal with updating drivers (if you need it), what to do about Retina macs, and even a hint to deal with a freezing problem in Windows 8 if it affects you.
  • Installing IIS in Windows 8

The end result is incredible. It is lightning fast, and my first impressions of Windows 8 are extremely positive.
Monday, 26 November 2012 16:59:26 (GMT Standard Time, UTC+00:00)  #    Comments [0] -
# Saturday, 17 November 2012
After getting my WHS Console issue resolved, I noticed that my Shared Folders tab listed several of my folders as "Failing (Check Health)". That led me to look at the Server Storage tab where one of the drives was reported as "Missing". The drive was dead. Not to worry, I thought, because I had Duplication turned on for those folders that I really did not want to lose (e.g. music, photos).

I spot-checked several items in those folders and noticed that they weren't accessible. I tried to copy them off of the WHS server to my local PC and was greeted with an "Error 0x8007048F: The device is not connected." message. I removed the bad drive from the WHS and rebooted, and still, things looked bad.

Thanks to a post in the We Got Served Forums, the answer was to simply go to the WHS Console, Server Storage tab, and Remove the bad drive via software as well. After doing that, I had to let WHS do its thing, but after a couple of hours, all of my files are fine, the folders are listed as "Healthy", and I have a new hard drive on the way.

What I thought was a potentially devastating issue turned out to be handled easily, reliably, and gracefully through WHS. What an awesome device.
Saturday, 17 November 2012 14:56:20 (GMT Standard Time, UTC+00:00)  #    Comments [0] -

# Friday, 16 November 2012
I've had pretty good luck with my WHS (v1) system. It's been mostly fire-and-forget, and I've successfully restored 2 computers from the automatic backup. In short, I've been pretty happy. This is the first of 2 posts about problems I've now had with it: one minor (this one), and the next, potentially devastating.

For a while now, I haven't been able to select the "Windows Home Server Console" menu to connect to my WHS Console. It would sit there forever and just error out.

I was able to RDP in to the box, though, and from there, I was able to get to the C:\Documents and Settings\All Users\Application Data\Microsoft\Windows Home Server\logs\console.*.log file. In there, I saw this:

[11/15/2012 5:32:17 PM 182c] WARN : CreateAndConnectRdp - admin name is WHS\Administrator.
[11/15/2012 5:32:17 PM 182c] Connecting to WHS at path C:\Program Files\Windows Home Server\
[11/15/2012 5:32:28 PM 182c] WARN : Disconnected, reason=2, extended reason=0

I then noticed that this file on the server was blank: c:\program files\windows home server\homeserverconsole.exe.config. I deleted it, rebooted the server, and was able to connect to the Console again. What I found after connecting, though, was most troubling. More on that later...
Friday, 16 November 2012 02:27:44 (GMT Standard Time, UTC+00:00)  #    Comments [0] -

# Thursday, 08 November 2012
The new DisplayMode engine in ASP.NET MVC 4 is rather nice. By default, it will detect whether you come in from a mobile device or a desktop machine, and load up a specialized view for you based on that detection.

For example, if you were trying to navigate to the Players index from an iPhone, the default behavior is to look for Players\Index.Mobile.cshtml, followed by Players\Index.cshtml. Whichever it finds first is the view that gets used. Note that this also requires you to have a _Layout.Mobile.cshtml file.

This is great for new projects, but I have a non-trivial app with a lot of views that have been built over the years using the technique described by Scott Hanselman. That approach looks for the file in Players\Mobile\Index.cshtml. I was not looking forward to renaming all of those files.

In order to use my already existing file structure, I added one class:

public class SubfolderDisplayMode : DefaultDisplayMode
{
    public SubfolderDisplayMode() : base("Mobile")
    {
        // Be sure to use a good browser capabilities file
        ContextCondition = (context => context.GetOverriddenBrowser().IsMobileDevice);
    }

    protected override string TransformPath(string virtualPath, string suffix)
    {
        var dir = VirtualPathUtility.GetDirectory(virtualPath);
        var filename = VirtualPathUtility.GetFileName(virtualPath);
        return VirtualPathUtility.Combine(dir, "Mobile/" + filename);
    }
}

Registered it in my Global.asax.cs:

DisplayModeProvider.Instance.Modes.Insert(0, new SubfolderDisplayMode());

It works very well, and I can come back and rename those files later.

Also be sure to put a call to @Html.Partial("_ViewSwitcher") somewhere in your desktop _Layout.cshtml file so the user can get back to the mobile version of the site.
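For reference, the view switcher works by overriding the browser for the current session. A rough sketch of the controller behind that partial, modeled on the jQuery.Mobile.MVC package (check against the actual package source if you use it):

```csharp
public class ViewSwitcherController : Controller
{
    public RedirectResult SwitchView (bool mobile, string returnUrl)
    {
        // If the requested mode already matches the real device, clear any
        // override; otherwise force the requested mode for this session.
        if (Request.Browser.IsMobileDevice == mobile)
            HttpContext.ClearOverriddenBrowser ();
        else
            HttpContext.SetOverriddenBrowser (mobile ? BrowserOverride.Mobile : BrowserOverride.Desktop);

        return Redirect (returnUrl);
    }
}
```

Because SubfolderDisplayMode's ContextCondition checks GetOverriddenBrowser(), this override is what flips the view selection back and forth.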

Updated on 11/9/12 to clean the sample code up a bit
Thursday, 08 November 2012 21:11:20 (GMT Standard Time, UTC+00:00)  #    Comments [0] -
# Wednesday, 29 August 2012
After installing the RTM version of VS2012 last week and upgrading my project, I noticed that my web site was broken. The real culprit seemed to be that Microsoft changed how bundling works (the process of combining and minifying resources like JavaScript and CSS files) between the RC version and the RTM version. For an excellent background on what this feature can do for you, see this tutorial.

It turns out my real problem was with Kendo. They don't provide non-minified files in the trial version, so the new bundling mechanism wasn't including the minified assets, which meant the Kendo minified assets were getting stripped out when I was trying to run under debug. (Reference here). I was able to use the work-around offered on this thread, and things started working again.

However, there are a few wrinkles when using the bundling code, so I thought I'd capture my experiences here. For example:

  • For 3rd party components that don't ship with non-minified assets, you need a better way around the default bundling strategy provided in the kendo article above. The solution provided in that thread of removing the min files from the ignore list will end up duplicating other assets that do provide both a minified and regular version of their assets while running in debug mode, e.g. jquery. There is a workaround for this, though: If you specify your bundle file's pattern with the {version} macro, then the bundling framework is smart enough to include just the one copy of the asset. If you use a wildcard in the pattern (as shown in the thread), you will get duplicate min and non-min versions of the asset when you render. Here is what your code should look like:
    bundles.Add(new ScriptBundle("~/bundles/jquery").Include(
        "~/Scripts/jquery-{version}.js"));
  • When specifying a CDN location, right now, you can't use the {version} macro, so you end up with a hardcoded reference for your CDN link (1.7.1 in the sample) and a {version} macro for the non-CDN reference (which could be 1.7.1, 1.7.2, or anything else that exists in your solution). This means that when you update locally, you need to remember to keep those version numbers in sync. The CDN path could take the version you have locally and substitute it into the CDN path that you provide, removing this requirement.
  • In the tutorial, there is a fallback script that they recommend you write after your Scripts.Render statement so you can gracefully fall back to the local version if the CDN version doesn't load. It would be much better if that fallback code would be emitted for you when you call Scripts.Render.
  • Speaking of CDN, it appears that there is a very tight coupling assumed between a bundle and a CDN path. In other words, you cannot include multiple assets in one bundle because the CDN path for the bundle assumes it is a reference to a specific file on the CDN. It would be better to have CDN paths be tied to each individual item in the bundle.
  • Also relating to CDN support: The CDN path will only be used if you set bundles.UseCdn to true AND you either have BundleTable.EnableOptimizations set to true or compilation debug set to false in your web.config. Granted, you probably only want to use the CDN when you are pushing to production, but while trying to test things out, I was doing it locally and this caught me by surprise. These two settings should be independent of each other. If not, why even bother having the UseCdn property?
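Putting the CDN points above together, a registration that uses both a hardcoded CDN version and the {version} macro locally looks roughly like this (the CDN URL and version number are illustrative, not prescriptive):

```csharp
public static void RegisterBundles (BundleCollection bundles)
{
    // Only honored when optimizations are on (or debug="false"), per the
    // coupling described above.
    bundles.UseCdn = true;

    // Note the hardcoded 1.7.1 in the CDN path vs. the {version} macro for
    // the local file -- these must be kept in sync by hand when upgrading.
    bundles.Add (new ScriptBundle ("~/bundles/jquery",
        "http://ajax.aspnetcdn.com/ajax/jQuery/jquery-1.7.1.min.js")
        .Include ("~/Scripts/jquery-{version}.js"));
}
```

The hand-written CDN fallback the tutorial recommends (a small script after Scripts.Render that document.writes the local copy if window.jQuery is undefined) then goes in the layout.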

It seems that there is quite a bit of friction in the current version of the bundling framework. Fortunately, it resides in the Optimization assembly which can be upgraded independently of the entire MVC framework. I hope Microsoft releases an update very soon to overcome these obstacles. I have every reason to believe that they will since I'm seeing the author of this assembly answering tons of questions on StackOverflow.

Wednesday, 29 August 2012 19:39:05 (GMT Daylight Time, UTC+01:00)  #    Comments [0] -
# Monday, 30 July 2012
I'm trying to settle on a standard for a grid component for use in an ASP.NET MVC app. It looks like jQuery DataTables (via the Mvc.JQuery.Datatables wrapper) is the winner for now.

Here is my quick run-down on how to get it working:

  • Using NuGet, install Mvc.JQuery.Datatables
  • Using NuGet, install EmbeddedVirtualPathProvider.
  • In the generated App_Code\RegisterVirtualPathProvider.cs file, add the following line where the comment tells you to. These last 2 steps are needed to get the grid to show up on the page.
    {typeof(Mvc.JQuery.Datatables.DataTableVm).Assembly, @"..\Mvc.JQuery.Datatables"} 
  • The sample found here is a little outdated and the instructions on the page don't match what's happening inside the actual view. Use these instructions instead:
    • In the controller, you need to return a DataTablesResult (not an IDataTablesResult). The assembly relies on finding an action with this return type, so that also means that you can't redirect views, which isn't the worst thing since this should be an AJAX request anyways.
    • In the view, add this to the top (instead of the calls to the link and script tags cited on the page):
    • In the view, the code should look similar to this where you want the grid (remember, you need to have the controller action method return DataTablesResult in order for this to work):
        @{ var vm = Html.DataTableVm("table-id", (DashboardController h) => h.GetWireHistoryData(null)); }
        @Html.Partial("DataTable", vm)
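The controller side of the wiring above looks roughly like this. Treat it as a sketch: GetWireHistoryData and WireHistoryRow are names from my app, and the DataTablesResult.Create factory should be verified against the version of Mvc.JQuery.Datatables you install:

```csharp
public class DashboardController : Controller
{
    // The package finds this action by its DataTablesResult return type,
    // then pages/sorts/filters the IQueryable using the posted parameters.
    public DataTablesResult<WireHistoryRow> GetWireHistoryData (DataTablesParam dataTableParam)
    {
        IQueryable<WireHistoryRow> data = GetWireHistory (); // your own query
        return DataTablesResult.Create (data, dataTableParam);
    }
}
```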

There is also a good series expanding on DataTables usage in ASP.NET MVC.

Thanks to Harry for the NuGet package. It definitely will make keeping things up to date much easier.

Monday, 30 July 2012 20:46:19 (GMT Daylight Time, UTC+01:00)  #    Comments [0] -
# Friday, 27 April 2012
I needed a way to get a list of specific scheduled tasks running on a server. The main problem is that the command-line tool, schtasks.exe, is horrid. There is no ability to filter data. You get all of the data, or you get none. This is the Powershell script that I eventually settled on after piecing together a bunch of StackOverflow and blog entries. The key is the convertfrom-csv cmdlet that turns the result into objects that can be queried on, instead of a list of strings.

It's my first time using Powershell, and I don't know quite what to think. While it helped me achieve my objective, and is quite powerful and extensible, it just doesn't feel natural to me. I'm sure that would change over time if I decide to invest the time and energy to grok it. If anyone knows of a less awkward LINQ-style syntax, please let me know, because that would turn me around on Powershell in a heartbeat.

schtasks /query /v /fo:csv | convertfrom-csv | where {$_.'Task To Run' -like '*MyProcess.exe*' }
Friday, 27 April 2012 14:36:50 (GMT Daylight Time, UTC+01:00)  #    Comments [0] -

# Saturday, 10 March 2012
There is no way I can list everything I learned with MonoTouch over the last couple of months. I'll summarize my experience by saying I'm a very happy customer. There were bumps and bruises along the way, but between the mailing list and the support crew at Xamarin, I heartily recommend investigating MonoTouch if you're a .NET developer who wants to get to the iOS App Store quickly. I had to take several detours along the way (converted my existing app from Lightspeed to EF4 CodeFirst, converted to use POCOs, had support obligations, and wrote a sync engine to communicate over ServiceStack), but in the end, things lined up pretty well.

Some of the highlights:

  • I had to write code to essentially mimic the context loading that EF would do for you. This included fixing up object references as well as reading and writing from the SQLite database. Not horribly difficult, but it was something I'd rather I didn't have to do.
  • I encountered a couple of problems executing various LINQ statements when running on the device. A quick test case, and the devs at Xamarin had me going again with either workarounds or fresh bits to solve my problems.
  • Be sure to embrace threading when making web calls - especially on startup. You have 15 seconds to have your app launched on the device, or the device will think it is hung and kill the app.
  • Deploying to the App Store has been written about extensively as a complex and intricate process. It turns out there's good reason for that. After I got through an error due to linking my release build to ServiceStack.Text.dll, the resulting upload to the App Store was failing verification. For some reason, the application name was not being accepted. I changed the name and it sailed right through. I have no idea exactly why this was required, but there you have it.
  • MonoTouch.Dialog is a very nice framework for building a line of business app. Be sure to check it out.
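The threading point above is worth a sketch: keep FinishedLaunching fast by pushing web calls onto a background thread and marshaling results back to the UI thread. The syncEngine/UpdateUi names here are illustrative placeholders, not real APIs:

```csharp
public override bool FinishedLaunching (UIApplication app, NSDictionary options)
{
    window.MakeKeyAndVisible ();   // get the UI up immediately

    // Do the slow network work off the UI thread so launch returns
    // before the watchdog's time limit kills the app.
    ThreadPool.QueueUserWorkItem (_ => {
        var data = syncEngine.FetchUpdates ();          // hypothetical slow web call
        InvokeOnMainThread (() => UpdateUi (data));     // UIKit work belongs on the main thread
    });

    return true;   // returns quickly; the watchdog stays happy
}
```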

When I look back on the road I travelled to get my app to the app store, I'm impressed with how much of the business logic I was able to carry over. The time savings in being able to bring my business logic across as POCOs that have been extensively tested in production over many years was the real reason I went with MonoTouch to begin with. I most definitely do not regret that decision.

Note: I was not compensated or asked to write this post. I am just a happy paying customer of a product that saved me time, and I wanted to share my experience.

Saturday, 10 March 2012 01:48:03 (GMT Standard Time, UTC+00:00)  #    Comments [0] -
.NET | iPhone
# Wednesday, 29 February 2012
The iPhone app that I'm writing uses ServiceStack to communicate with an existing ASP.NET MVC app that I've had in production for a long time. I do my iPhone development with MonoTouch on the Mac side, and I use VMware Fusion to run Windows as a guest OS. This post highlights a few of the tips that I found handy.

  1. Getting IIS Express to work from an external server (even the Mac OS host) is theoretically possible. I found articles lying around the net saying it could work, but it never worked for me. I ended up going back to Cassini (WebDev.WebServer40.exe) and using tcpTrace to listen externally on port 8080 and forwarding to my local port (e.g. 1234).
  2. In order to get VMware Fusion using NAT to talk to my Windows OS on a consistent IP address, I added this section at the bottom of /Library/Preferences/VMware Fusion/vmnet8/dhcpd.conf (replacing the MAC address and IP address with your own values)

    host winguest {
    	hardware ethernet xx:xx:xx:xx:xx:xx;
    	fixed-address 192.168.xxx.xxx;
    }
  3. In order to get external devices (e.g. my iPhone connected to the same wireless network) to see into the Windows OS, I set up port forwarding to route requests coming in to the Mac on port 80 to port 8080 on the Windows machine. I did this by modifying this section in /Library/Preferences/VMware Fusion/vmnet8/nat.conf:

    80 = 192.168.xxx.xxx:8080

After all of that, I can communicate from my iPhone through my Mac into the VMWare-hosted Windows machine to get at the data.

Wednesday, 29 February 2012 00:37:46 (GMT Standard Time, UTC+00:00)  #    Comments [0] -
iPhone | Macintosh
About the author/Disclaimer

The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.

© Copyright 2018
Dan Miser
DasBlog theme 'Business' created by Christoph De Baene (delarou)