SOLVED: Connecting iPhone to L2TP/IPSec VPN on EE

TLDR: VPNs sometimes don’t work on EE’s mobile network due to IPv6. This blog post contains a ready-made device profile that forces IPv4 on EE’s mobile data network.

I set up an L2TP/IPSec VPN on my Synology Diskstation so that I could connect to my home network when out and about. While this works fine when I’m on WiFi, I couldn’t get it to work on EE’s mobile data network on my iPhone. When connecting to the VPN I would get the message: “The L2TP-VPN server did not respond. Try reconnecting. If the problem continues, verify your settings and contact your Administrator.”

There are various posts on https://community.ee.co.uk suggesting it might be to do with Data Content Controls, but enabling Full Access there didn’t help.

Other posts suggest it is something to do with IPv6 and that disabling IPv6 would do the trick, but iOS 12 doesn’t appear to have an end-user-configurable option to do so.

Thankfully there is a way, which I discovered thanks to this Apple StackExchange answer, using Apple Configurator: a freely available macOS tool typically used by businesses to customise their employees’ iOS devices. With Apple Configurator you can create a configuration profile, which is an XML file with a “.mobileconfig” file extension.

By creating a cellular configuration you can specify the APN (Access Point Name) settings your phone uses to log onto your provider’s mobile data network, and you can force it to use IPv4 only. The EE settings are available from apn-settings.com, so I filled them in and set the IP version to just IPv4.

To install the EE IPv4-only profile

Here is the profile I created: EE APN IPv4.mobileconfig ← click this link to download the profile. You will be prompted: “This website is trying to open Settings to show you a configuration profile. Do you want to allow this?”

Click Allow, and on the Install Profile screen click Install at the top right.

Follow any prompts to install the profile.

To remove the profile

If you switch to another carrier, and especially if you are roaming in another country, you might need to remove the profile for mobile data to work (I don’t know for sure; I haven’t tried it abroad yet). To remove the profile, go to Settings > General > Profile & Device Management.

From here you can remove the profile.

How to get YouTube to permanently use the larger video player size (without extensions or plugins)

It’s annoyed me for some time that the player size option in YouTube has become increasingly hidden away, and is forgotten every time you restart your browser.

Thankfully the option is stored simply in a cookie (wide=1), which is easy enough to set so that it is reasonably permanent without using any browser plugins or extensions.

Google Chrome

If you use the Google Chrome browser follow these instructions:

  1. First go to youtube.com (important, otherwise the cookie will not get set)
  2. In your browser address bar type javascript: (lowercase, the colon is important)
  3. Paste this code after the javascript: you just typed:
    document.cookie = "wide=1;domain=youtube.com;expires=Fri, 31 Dec 9999 23:59:59 GMT"

Any videos you watch from now on should play in the larger video player every time you visit YouTube.

Internet Explorer

Recent versions of Internet Explorer don’t seem to allow you to type arbitrary JavaScript into the browser address bar, so you have to use the F12 Developer Tools to run the script.

  1. First go to youtube.com (important, otherwise the cookie will not get set)
  2. Press the F12 function key, or choose F12 Developer Tools from the Tools menu button at the top right.
  3. In the window that appears, click the Console tab and at the very bottom, at the >> prompt, paste the following code:
    document.cookie = "wide=1;domain=youtube.com;expires=Fri, 31 Dec 9999 23:59:59 GMT"

You can now close the F12 Developer Tools window and enjoy the larger video player every time you visit YouTube.

Hope that helps!

Detecting if .NET Framework 4.5 is installed

There have been a number of posts on this already by Scott Hanselman and Rick Strahl, but I needed a simple way to detect, from somewhere basic like a batch file, whether .NET 4.5 is installed.

I’m not sure where I came across this but it works well for me. The solution is to query for the existence of the following registry key:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v4.0.30319\SKUs\.NETFramework,Version=v4.5

So in my batch file I can simply use reg query to do the work:

reg query "HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v4.0.30319\SKUs\.NETFramework,Version=v4.5" 2>nul
if errorlevel 1 (
    echo .NET Framework 4.5 is NOT installed
) else (
    echo .NET Framework 4.5 is installed
)

Note that there’s a more complicated way documented on MSDN, but the above method is easy to use from a batch file and works well for my purposes.
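
If you need the same check from managed code rather than a batch file, here’s a minimal sketch of the same registry test in C# using the standard Microsoft.Win32 registry API (just the batch logic translated; the class name is mine, not part of any framework):

using System;
using Microsoft.Win32;

class DetectNet45
{
    static void Main()
    {
        // Same check as the batch file: does the .NET 4.5 SKU key exist?
        const string keyPath =
            @"SOFTWARE\Microsoft\.NETFramework\v4.0.30319\SKUs\.NETFramework,Version=v4.5";

        using (var key = Registry.LocalMachine.OpenSubKey(keyPath))
        {
            Console.WriteLine(key != null
                ? ".NET Framework 4.5 is installed"
                : ".NET Framework 4.5 is NOT installed");
        }
    }
}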

Hope that helps!

Backing up AppHarbor SQL Server databases using BACPACs

One of the regularly requested features for AppHarbor is a way to get full backups from SQL Server databases there. The official line so far has been to use the AppHarbor-SqlServerBulkCopy tool to bulk copy the data from the hosted server to a local copy. This requires you to have an empty target database into which to squirt the data.

I’ve found a more convenient mechanism for getting a simple backup of AppHarbor-hosted databases using *.BACPAC files, which it seems were originally created for getting databases in and out of Microsoft’s Azure cloud-hosting platform.

BACPAC files are Data-Tier Application (aka DAC) packages: essentially ZIPs that contain an XML representation of the schema plus the actual table data stored in JSON format. Schema-only packages have a DACPAC file extension instead.

(Bob Beauchemin has some good info on Data-Tier Applications (DAC) on his blog)
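
Since a .bacpac is just a ZIP archive, you can peek inside one yourself. This little sketch (assuming .NET 4.5 and a reference to System.IO.Compression.FileSystem; the path and class name are only placeholders) lists the entries, which include the schema XML and the per-table data files:

using System;
using System.IO.Compression;

class BacpacPeek
{
    static void Main()
    {
        // Open the .bacpac as an ordinary ZIP and list its contents
        using (var archive = ZipFile.OpenRead(@"C:\temp\MyDb.bacpac"))
        {
            foreach (var entry in archive.Entries)
                Console.WriteLine(entry.FullName);
        }
    }
}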

You can back up and restore databases in SQL Server Management Studio 2012 (the Express version is fine) – look for the Import/Export “Data-tier Applications” commands on the context menu.

I think “Data-tier Applications” are what we are supposed to call databases now…

Programmatically

If you want to do this programmatically, use Microsoft.SqlServer.Dac.dll, which can be found in C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin if you have some flavour of SQL Server 2012 installed.

Exporting is simple enough:

using Microsoft.SqlServer.Dac;
…
// Export
var dacServices = new DacServices(remoteConnectionString);
dacServices.Message += (sender, e) => Console.WriteLine(e.Message);
dacServices.ExportBacpac(@"C:\temp\MyDb.bacpac", "MyDb");

And importing is similarly straightforward:

// Import
var dacServices = new DacServices(localConnectionString);
dacServices.Message += (sender, e) => Console.WriteLine(e.Message);

var package = BacPackage.Load(@"C:\temp\MyDb.bacpac");
dacServices.ImportBacpac(package, "MyDb");
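
Putting the two together, here’s a rough console-app sketch using the same DacServices calls shown above (the connection strings, database name, and file path are placeholders) that pulls a copy of a remote database down into a local SQL Express instance in one go:

using System;
using Microsoft.SqlServer.Dac;

class BackupAppHarborDb
{
    static void Main()
    {
        // Placeholder connection strings - substitute your own
        const string remote = "Server=foo.sqlserver.sequelizer.com; Database=MyDb; User ID=blah; Password=Pa55w0rd";
        const string local = @"Server=.\SQLEXPRESS; Database=MyDb; Integrated Security=True";
        const string bacpac = @"C:\temp\MyDb.bacpac";

        // Export the remote database to a .bacpac file...
        var export = new DacServices(remote);
        export.Message += (sender, e) => Console.WriteLine(e.Message);
        export.ExportBacpac(bacpac, "MyDb");

        // ...then import it into the local server
        var import = new DacServices(local);
        import.Message += (sender, e) => Console.WriteLine(e.Message);
        import.ImportBacpac(BacPackage.Load(bacpac), "MyDb");
    }
}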

Command-line

There is a command-line tool called SqlPackage which provides access to the same functionality, but for some reason it only recently gained the ability to export a bacpac. The SqlPackage.exe export feature was included as part of the SQL Server Data-Tier Application Framework (September 2012) update.

Download (http://go.microsoft.com/fwlink/?LinkID=266427) and install the appropriate flavour of DACFramework.msi.

Exporting:

"C:Program Files (x86)Microsoft SQL Server110DACbinSqlPackage.exe" ^
    /Action:Export ^
    /SourceConnectionString:"Server=foo.sqlserver.sequelizer.com; Database=dbXYZ; User ID=blah; Password=Pa55w0rd" ^
    /TargetFile:"C:tempMyDb.bacpac"

Importing:

"C:Program Files (x86)Microsoft SQL Server110DACbinSqlPackage.exe" ^
    /Action:Import ^
    /SourceFile:"C:tempMyDb.bacpac" ^
    /TargetConnectionString:"Server=.sqlexpress; Database=MyDb; Integrated Security=True"

Hope that comes in useful for you!

Solution for “Package ‘Microsoft SQL Management Studio Package’ failed to load”

I had an odd issue where SQL Server Management Studio (SSMS) 2008 R2 would start up, display the following error message, then exit:

Package ‘Microsoft SQL Management Studio Package’ failed to load

Using the SysInternals Process Monitor tool to monitor SSMS.exe, it appeared that SSMS was looking for the following registry key but not finding it:

HKEY_CURRENT_USER\Software\Microsoft\Microsoft SQL Server\100\Tools

Renaming a parent key (rather than deleting it, just in case that didn’t fix the issue) solved the problem for me. The key I renamed was:

HKEY_CURRENT_USER\Software\Microsoft\Microsoft SQL Server

Running SSMS again recreated the 100\Tools subkeys and all was well. I hope it works for you too.

Once helper for ASP.NET MVC

Here’s an ASP.NET MVC HTML Helper that helps in the following scenario. Let’s say you have a partial that can be included a number of times in a view but has a bit of common code that need only be included once. Typically that bit of common code would be some script.

In ASP.NET such a scenario is addressed by RegisterClientScriptBlock. In the Spark View Engine it’s taken care of by the once attribute.

So, inspired by Phil Haack’s Templated Razor Delegates post from earlier this year, I knew you could write helper functions for Razor that take arbitrary markup. The key to it is taking a Func argument that returns a HelperResult.

To use it you invoke it like so:

@Html.Once("some unique key", @<div>arbitrary markup that gets rendered just once</div>)

More typically, for including script, e.g.:

@Html.Once("TABLE_SORTER_INIT_SCRIPT", @<script type="text/javascript">
    $(function() {
        $('table.sortable th').each(function(){
            // . . .
        });
    });
</script>)

Here’s the implementation that adds the Once extension method to HtmlHelper:

using System;
using System.Web.Mvc;
using System.Web.WebPages;

namespace Foo
{
    public static class HtmlUtils
    {
        public static HelperResult Once(this HtmlHelper html, string key, Func<object, HelperResult> template)
        {
            var httpContextItems = html.ViewContext.HttpContext.Items;
            var contextKey = "HtmlUtils.Once." + key;
            if (!httpContextItems.Contains(contextKey))
            {
                // Render and record the fact in HttpContext.Items
                httpContextItems.Add(contextKey, null);
                return template(null);
            }
            else
            {
                // Do nothing, already rendered something with that key
                return new HelperResult(writer => { /*no-op*/ });
            }
        }
    }
}
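
Because rendered keys are tracked in HttpContext.Items, the “once” scope is per request: the markup is emitted at most once per page render, and will render again on the next request.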

Hope that helps!

LESS + CoffeeScript for ASP.NET = LessCoffee

As documented in recent posts, I’ve been tinkering with getting the LESS and CoffeeScript compilers running on Windows Script Host. I’ve now got round to wrapping these up as ASP.NET HTTP Handlers so you can easily use them in ASP.NET-based websites. You simply reference the *.less and *.coffee files and they get served up as CSS and JavaScript directly. For example:

<link href="content/style.less" rel="stylesheet">
<script src="content/site.coffee"></script>

No need to install add-ins into Visual Studio or add build steps to your project. The main downside is that it won’t run on non-Windows platforms under Mono (although I’m tempted to adapt it to use Mozilla’s SpiderMonkey JavaScript Shell).

If you’re running Visual Studio 2010 then simply use the LessCoffee NuGet package.

PM> Install-Package LessCoffee

If you’re using Visual Studio 2008 you’ll need to follow these manual steps:

  • Copy LessCoffee.dll to your web application’s /bin directory
  • Add the following entries to your web.config file:
    <system.web>
        <httpHandlers>
            <add path="*.coffee" type="DotSmart.CoffeeScriptHandler, LessCoffee" verb="*" validate="false"/>
            <add path="*.less" type="DotSmart.LessCssHandler, LessCoffee" verb="*" validate="false"/>
        </httpHandlers>
    </system.web>

    <!-- IIS 7 -->
    <system.webServer>
        <validation validateIntegratedModeConfiguration="false"/>
        <handlers>
            <add path="*.coffee" type="DotSmart.CoffeeScriptHandler, LessCoffee" verb="*" name="DotSmart.CoffeeScriptHandler"/>
            <add path="*.less" type="DotSmart.LessCssHandler, LessCoffee" verb="*" name="DotSmart.LessCssHandler"/>
        </handlers>
    </system.webServer>

If you’re using Windows 2003/IIS 6 then you will need to map the file extensions *.less and *.coffee to aspnet_isapi.dll.

The source is on GitHub, obv: https://github.com/duncansmart/LessCoffee

DDDSW Hecklegate

I went to the free DDDSW developer conference on Saturday in Bristol which was excellent. Kudos to all the organisers and speakers and sponsors who made it happen.

One of the sessions I attended stood out, though, because the speaker, although apparently experienced, had a pretty tough time, especially with some of the comments submitted to an audience feedback web app being used by attendees at the conference. Actually, I found myself agreeing with many of the sentiments in those comments (as did the person I sat next to) and felt the session didn’t go well, even though it contained some great content. Here’s my take on it.

Starting a session by saying how tired you are and how you haven’t slept for days is effectively saying: “sorry, this might be a bit shit”. You may feel justified in saying this because you are delivering the session for no fee and may indeed have incurred substantial expense in travelling to the conference. Nobody cares. It rubs your audience up the wrong way because they may have incurred considerable expense in getting there too. In fact it’s their free time you’re saying you may be about to waste. They may start to feel their time would have been better spent in another session. Also, consider that your slot at the conference may have been at the expense of someone else, maybe a newbie who would have loved their first opportunity in the spotlight.

Doing too many “hands up if you…” audience questions can get tedious quickly. Indeed, don’t continually ask people to put their hands up if you’re going to say they’re wrong. It might be OK once, but more than that and people are going to feel uncomfortable and antagonised.

If someone walks out, ignore it. Making a point of it makes you look petty. Just maybe they actually had valid reasons for leaving, or indeed, maybe they weren’t enjoying the session. Just let it go.

Finally, there’s a distinction between being “passionate and opinionated” and coming across as a blowhard.

First steps with IronJS 0.2

Following the release of IronJS 0.2, the code below is the result of a 30-minute play I had this morning; it shows how easy it is to embed a fully .NET-based JavaScript runtime in your application simply by referencing IronJS.dll.

It’s changed quite a bit from prior versions, and I think you’ll see it has become much easier to host since Dan Newcombe’s experiments last year.

//reference IronJS.dll
using System;
using System.IO;

class IronJsDoodles
{
    static void Simple()
    {
        var context = new IronJS.Hosting.CSharp.Context();
        object result = context.Execute("1 + 2;");

        Console.WriteLine("{0} ({1})", result, result.GetType());
        // "3 (System.Double)"
    }

    static void InteractingWithGlobal()
    {
        var context = new IronJS.Hosting.CSharp.Context();

        context.SetGlobal("a", 1d);
        context.SetGlobal("b", 2d);
        context.Execute("foo = a + b;");

        double foo = context.GetGlobalAs<double>("foo");

        Console.WriteLine(foo);
        // "3"
    }

    static void AddingHostFunctions()
    {
        var context = new IronJS.Hosting.CSharp.Context();

        // Effectively the same as context.CreatePrintFunction() 🙂
        var print = IronJS.Native.Utils.createHostFunction<Action<string>>(context.Environment,
            delegate(string str)
            {
                Console.WriteLine(str);
            });
        context.SetGlobal("print", print);

        context.Execute("print('Hello IronJS!')");
    }
}

Hope it helps you get started.

SOLVED: MSDeploy error “(400) Bad Request”

While working from home I was trying to use MSDeploy (aka Web Deploy, or the Publish Web command in Visual Studio 2010) to update an internal site. Whilst this would work perfectly when I was physically in the office, when working from home via the VPN it would fail with the following error:

Remote agent (URL http://myserver.example.com/MSDEPLOYAGENTSERVICE) could not be contacted.  Make sure the remote agent service is installed and started on the target computer.
An unsupported response was received. The response header 'MSDeploy.Response' was '' but 'v1' was expected.
The remote server returned an error: (400) Bad Request.

To see what was going on I started Fiddler and tried the publish again. (One crucial thing I had to do for Fiddler to capture traffic when connected via the VPN was to fully qualify the machine name: instead of http://myserver, use http://myserver.your-corp.net as the service URL, otherwise the traffic isn’t captured.)

This is what the exchange looked like:

POST http://myserver.example.com/MSDEPLOYAGENTSERVICE HTTP/1.1
MSDeploy.VersionMin: 7.1.600.0
MSDeploy.VersionMax: 7.1.1042.1
MSDeploy.RequestUICulture: en-US
MSDeploy.RequestCulture: en-GB
Version: 8.0.0.0
MSDeploy.Method: Sync
MSDeploy.RequestId: fde03509-b23e-4759-9353-e8dbf19a2293
Content-Type: application/msdeploy
MSDeploy.ProviderOptions: H4sIAAAAAAAEAO29B2AcSZYlJi9tynt...
MSDeploy.BaseOptions: H4sIAAAAAAAEAO29B2AcSZYlJi9tynt/SvV...
MSDeploy.SyncOptions: H4sIAAAAAAAEAO29B2AcSZYlJi9tynt/SvV...
Host: myserver.example.com
Transfer-Encoding: chunked
Expect: 100-continue
...

And in response:

HTTP/1.1 400 Bad Request (The HTTP request includes a non-supported header. Contact your ISA Server administrator.)
Via: 1.1 IBISA2
Connection: Keep-Alive
Proxy-Connection: Keep-Alive
Pragma: no-cache
Cache-Control: no-cache
Content-Type: text/html
...

There in the clear was the crucial error information that MSDeploy was failing to relay: those whacky MSDeploy HTTP Headers were being blocked by our ISA Server. (Note to the developers of MSDeploy: showing this information in debug or verbose modes would be very useful!)

After specifying an access rule on the ISA server to not filter proxied requests to the server in question based on HTTP headers, it all started working again.