Compsoft Flexible Specialists

Compsoft plc


Monday, November 30, 2009

Inline Block not quite Inline-blocking in IE8

So we have been working on a site that has a good number of buttons on it, but input buttons aren't that easy to make look nice.

So I've been round the site fitting snazzy buttons built out of anchors and spans with backgrounds to complete the look.

We've made use of the 'inline-block' value of the display CSS property. It looks proper nice in Firefox, Safari, Chrome, and Internet Explorer 7.

But.

Internet Explorer 8 makes a right meal out of it!

[Screenshot: adjacent buttons rendered stacked vertically in IE8]

All buttons that are next to each other appear stacked on top of one another. IE8 seems to have a problem coping with 'inline-block' elements.

However, there is a quick fix for this stacking issue, it would seem. All that is needed to get Internet Explorer 8 to honour the 'inline-block' style is to add a 'margin-right' of some value, along these lines (the original screenshot is lost, so the selector and value here are representative):

a.button { display: inline-block; margin-right: 1px; }

and now all the browsers render the buttons nicely:

[Screenshot: the buttons rendered side by side correctly in all browsers]

Friday, November 13, 2009

Tim Jeanes - TechEd 2009 - Day 5

WIA308 - The Biggest Little-Known Features in Microsoft Silverlight

(All samples from this session are available at http://www.wintellect.com/downloads/techedeurope2009.zip)

Typically Silverlight uses the browser's connectivity for all network access. However, Silverlight 3 introduces the "client stack", which uses the operating system's networking APIs. This makes it a lot more stable and reliable - for example, you now get full access to headers, and when you hit a SOAP error you'll now get the full error details back, instead of the munged version that the browser returns (the browser stack will only ever return a 200 or a 404). This also gives you access to HTTP PUT and DELETE, rather than just the GET and POST supported by the browser stack.

On the other hand, the client stack will never give you cached results; also you're limited to two concurrent connections.

You just have to call this:

HttpWebRequest.RegisterPrefix("http://[whatever]",
WebRequestCreator.ClientHttp);

... and then access connections as usual.

An event called CompositionTarget.Rendering is fired about 60 times per second on the UI thread, and can be used for custom animations. It's mostly used for games, but can be useful elsewhere, for example to work through a long-running queue of changes that need to be made to the UI - running on another thread would be inappropriate as that thread wouldn't be able to update the UI.
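As a sketch of the queue-draining idea described above (the queue name and per-frame batch size are illustrative, not from the session), hooking the event might look like this:

```csharp
using System;
using System.Collections.Generic;
using System.Windows.Media;

public class UiWorkQueue
{
    private readonly Queue<Action> pendingUiUpdates = new Queue<Action>();

    public void Enqueue(Action update)
    {
        pendingUiUpdates.Enqueue(update);
    }

    public void StartProcessing()
    {
        // Fires roughly 60 times a second, always on the UI thread.
        CompositionTarget.Rendering += OnRendering;
    }

    private void OnRendering(object sender, EventArgs e)
    {
        // Apply a small batch per frame so the UI stays responsive.
        for (int i = 0; i < 10 && pendingUiUpdates.Count > 0; i++)
        {
            pendingUiUpdates.Dequeue()();
        }

        if (pendingUiUpdates.Count == 0)
        {
            CompositionTarget.Rendering -= OnRendering; // stop when done
        }
    }
}
```

Because the handler runs on the UI thread anyway, each dequeued action can touch controls directly, with no Dispatcher marshalling required.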

There's an enhanced frame-rate counter that monitors the amount of memory used by the GPU. This is shown when EnableFrameRateCounter and EnableGPUAcceleration are both set to true.

BitmapCache.RenderAtScale can be used to scale up controls using the GPU, without pixelation.

A new class called Analytics allows you to monitor CPU load (both by this process and by all processes), as well as information about any GPUs that might be present.

The AssemblyPart class represents an assembly that's part of a Silverlight application. AssemblyPart.Load enables you to load assemblies at runtime, minimising the initial download time of the application. (NB: this can only be called on the UI thread.) By changing the CopyLocal property of a reference to false, you still get full IntelliSense when developing the application, but the assembly isn't sent down in the initial xap file. So long as you dynamically load the associated dll before you try to use it, your application will still work.

Be careful though! The JIT compiler compiles methods the first time it sees them: if this happens before you've loaded the assembly, you'll get a runtime exception. Also, the JIT compiler may choose to in-line some methods, so you don't know for sure when a method will be compiled. The safest way to avoid this (and retain strong-typing) is to use the [MethodImpl(MethodImplOptions.NoInlining)] attribute on the method that uses the dynamically-loaded reference.
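Putting the two points above together, a minimal sketch might look like this ("Extras.dll" and the Extras.Widget type are illustrative names, not from the session):

```csharp
using System;
using System.Net;
using System.Runtime.CompilerServices;
using System.Windows;

public partial class Loader
{
    // Download the dll on demand and load it with AssemblyPart.Load.
    private void LoadExtras()
    {
        var client = new WebClient();
        client.OpenReadCompleted += (s, e) =>
        {
            var part = new AssemblyPart();
            part.Load(e.Result);   // must be called on the UI thread
            UseExtras();           // safe: the assembly is loaded now
        };
        client.OpenReadAsync(new Uri("Extras.dll", UriKind.Relative));
    }

    // NoInlining keeps the JIT from compiling this method (and hence
    // resolving its types) before the assembly has been loaded.
    [MethodImpl(MethodImplOptions.NoInlining)]
    private void UseExtras()
    {
        var widget = new Extras.Widget();
    }
}
```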

Application Extension Services are services that have the same lifetime as the Silverlight application - they start just before the app, and end just after it ends. Implement IApplicationService (and optionally IApplicationLifetimeAware) on a class to make it such a service. We saw a sample where dynamic assembly loading was handled on-demand by such a service. It's a little complex, so check out the sample code.
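A bare-bones service of this kind (the class name and body are illustrative) just implements the two lifetime methods and is registered in App.xaml under Application.ApplicationLifetimeObjects:

```csharp
using System.Windows;

// Lives for the whole application: created just before Startup,
// stopped just after Exit.
public class LoggingService : IApplicationService
{
    public void StartService(ApplicationServiceContext context)
    {
        // Initialise here - runs before Application.Startup.
    }

    public void StopService()
    {
        // Clean up here - runs after the application ends.
    }
}
```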

The VisualTreeHelper class gives you access to the xaml that's been generated by templates in data-bound controls. This can be handy for programmatically styling items in a list, for example.

Silverlight 3 now has support for xaml-styled modal dialogs, using Child Windows.

VirtualizingStackPanel acts like a StackPanel that handles large numbers of items much better, by deferring templating the items until they scroll into view. ListBox now uses it by default, though ComboBox still doesn't.

RelativeSource can be used to bind a property of an element to the property of a nearby control (such as its parent).

AutomationPeer is usually used for accessibility issues, but can also be used to simulate button clicks.

NetworkInterface.GetIsNetworkAvailable() tells you whether or not a network is available. NetworkChange.NetworkAddressChanged is an event that will let you know when the network drops out or comes back.

DEV312 - Using and Extending Microsoft Visual Studio 2010 Architecture and Modelling Tools

A lot of this was typing code, so I won't reproduce it here, but there were a few key points.

Visual Studio uses MEF (the Managed Extensibility Framework) as its extensibility mechanism, so it's now included in VS2010. It gives you extensions for commands, gestures (such as drag-and-drop) and model validators. This does most of the hard work for you, giving you easy points to plug into VS events, so you can just concentrate on the things you want to do.

Model validators can be set to run whenever the model is saved or opened, or run manually from a menu.

VSIX is a packaging technology that uses the same packaging format as Office's xml file formats to package up your extensions, making them much easier to install.

Whilst developing an extension, pressing F5 launches a new instance of Visual Studio, installs the extension, and enables debugging in the first instance. This instance is entirely separate from a normal one, down to using its own set of registry settings. This is so much easier than it used to be!

There's a whole new (and much, much easier) model for inspecting and altering existing code. This was a third-party library he found, but I'll see if I can dig it out - our code generation tools are a little painful to work with sometimes.

This session focused entirely on extending the class diagram and the UML model diagram, but I believe similar extensions are available for other actions and items in Visual Studio.

DEV301 - Microsoft Visual Studio Tips and Tricks

I was going to try to keep up, writing them here as they were explained, but there were too many coming too fast, and in any case they're all here: scottcate.com/tricks. Thanks Scott!

(Most of these work in VS2008 as well - you don't have to wait for VS2010.)

Tech Ed 2009 - Friday - Neil Bostrom

Can you keep a secret? The biggest little-known features in Microsoft Silverlight

The default networking calls in Silverlight are all made through the browser. This has limitations when web services return server faults: the browser cannot interpret them correctly.

There are now two networking stacks in Silverlight 3: the browser stack and the client stack. The client stack goes direct to the OS network stack, and works correctly for web service faults. The browser stack only supports 200 and 404 return codes, which is the root of the web service fault limitation. A limitation of the client stack is that it does not get the content caching built into the browser.

To start using the client stack, you call HttpWebRequest.RegisterPrefix("http://", WebRequestCreator.ClientHttp) and all calls after that will be on the client stack.

You can check which stack any request object is using by checking its CreatorContext.

Silverlight 3 has an updated frame rate counter. To enable it, set EnableFrameRateCounter and EnableGPUAcceleration.

RenderAtScale is part of BitmapCache and allows you to control the size of the bitmap that gets handed off to the GPU by Silverlight.

Analytics class allows logging of CPU load and GPU usage.

AssemblyPart.Load allows you to load up assemblies on the fly. This means you can keep your xap file small and just pull down the extra parts as and when you need them. To lighten your xap file, you can change your references to Copy Local = false, which means they won't be shipped in your xap. For the dll to be easily downloaded, put a manual copy inside the ClientBin folder.

This technique does have a big flaw: the CLR resolves all the types a method uses before the method is run, and it will die if it can't find the reference. Marking the method with [MethodImpl(MethodImplOptions.NoInlining)] stops the JIT compiler from compiling (or inlining) it early, so its types aren't resolved until the assembly has been loaded.

Application Extension Services is a service model for building services that run alongside your application. All you need to do is implement IApplicationService, with its StartService and StopService methods. Silverlight will create the service as the application starts and close it just before exiting.

SynchronizationContext.Current has access to the UI dispatcher.

VisualTreeHelper is a handy class to grab controls inside Composite controls.

RelativeSource allows two way template binding! Awesome!

AutomationPeer class allows you to click buttons etc.

 

Architecting Silverlight Applications with MVVM

MVVM is really a solution for large-scale applications. It's not ideal for small projects, as it can get very bloated with layers etc.

MVVM is Model, View, ViewModel. As you start using this concept you'll find it very similar to MVC. You create a Model, which is your entity with data access; the View is your control XAML on the page; and then you create a ViewModel for your view that exposes everything you can bind to. The advantage of this is that ObservableCollection does not need to exist in your model - you can just expose it in your ViewModel. This keeps your model very clean of UI-dependent references.
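A minimal ViewModel sketch along those lines (the Customer type and property names are illustrative) keeps the ObservableCollection and change notification out of the model entirely:

```csharp
using System.Collections.ObjectModel;
using System.ComponentModel;

public class CustomerListViewModel : INotifyPropertyChanged
{
    // The view binds to this; the model never sees it.
    public ObservableCollection<Customer> Customers { get; private set; }

    private Customer selected;
    public Customer Selected
    {
        get { return selected; }
        set
        {
            selected = value;
            OnPropertyChanged("Selected");
        }
    }

    public CustomerListViewModel()
    {
        Customers = new ObservableCollection<Customer>();
    }

    public event PropertyChangedEventHandler PropertyChanged;

    protected void OnPropertyChanged(string name)
    {
        var handler = PropertyChanged;
        if (handler != null)
            handler(this, new PropertyChangedEventArgs(name));
    }
}
```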

A cute way to reduce the view's dependency on calling the model in code is to use behaviours that you can just attach to elements in the view. One handy behaviour out of the box is CallDataMethod, which can be wired up to your method on the model.

 

Code Contracts and Pex: Power Charge your assertions and Unit Tests

Code Contracts does validation on your code and comes up with warnings for certain coding errors. It allows you to state the intentions of your method calls, and generates the correct code to enforce those intentions.

Pex allows you to do unit testing by firing cases at your code that you might not have considered.

Code Contracts does make for very bloated code, though. For interfaces you need to create two more classes just to say something doesn't need to be null.

Code Contracts has a runtime checker, a static checker and documentation output, so a single stated intention gets used in a bunch of places.
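A sketch of what stating those intentions looks like (the Account class is illustrative): the same two lines feed the runtime checker, the static checker and the generated documentation.

```csharp
using System.Diagnostics.Contracts;

public class Account
{
    private decimal balance;

    public decimal Withdraw(decimal amount)
    {
        // Precondition: callers must pass a sensible amount...
        Contract.Requires(amount > 0 && amount <= balance);
        // ...postcondition: the new balance reflects the withdrawal.
        Contract.Ensures(
            Contract.Result<decimal>() ==
            Contract.OldValue(balance) - amount);

        balance -= amount;
        return balance;
    }
}
```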


Thursday, November 12, 2009

Tim Jeanes - TechEd 2009 - Day 4

WIA304 - Building Line-of-Business Applications Fast With Microsoft Silverlight and Microsoft .NET RIA Services

Given a data model, RIA services quickly generates the domain service classes for your web layer. The generated classes include skeleton methods to get, insert, update and delete the entities. Business rules (such as required fields) are automatically passed through so that the UI can respond to them.

The UriMapper xaml element provides a way of routing URL patterns to xaml controls, making your Silverlight application more website-like, and preserving the functionality of the back button. It uses URLs in the pattern mysite.com#/Customers - the # ensures you're really staying on the same page, so the Silverlight control doesn't have to be reloaded.

A goal of RIA Services is to take away some of the pain of handling asynchronous events, giving you cleaner, more readable code.

RIA Services provides you with some additional Silverlight controls. One we looked at was the DataPagerControl, which attaches to a datagrid and returns pages of data, all while writing virtually no code.

The DataControlToolkit adds a few new Silverlight controls to get you going more quickly with editing data. For example, the DataForm control will provide appropriate input controls for each field on your object.

The ActivityControl can be wrapped round your input form. When its IsBusy property is set to true, it masks itself and shows a Please Wait animation. This property can be bound to your domain service class (that wraps the async calls), thus making it dead easy to give your users a synchronous experience.

To enforce security, the [RequiresAuthentication] and [RequiresRole] attributes can be added to methods, preventing them from running if the user doesn't have sufficient credentials.
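A sketch of what that looks like on a domain service (the service class, entity and data-access calls are illustrative names for an Entity Framework-backed service):

```csharp
using System.Linq;

public class OrderService : DomainService
{
    // Any authenticated user may read orders.
    [RequiresAuthentication]
    public IQueryable<Order> GetOrders()
    {
        return this.ObjectContext.Orders;
    }

    // Only users in the Admin role may delete.
    [RequiresRole("Admin")]
    public void DeleteOrder(Order order)
    {
        this.ObjectContext.Orders.DeleteObject(order);
    }
}
```

Unauthorised calls are rejected on the server before the method body runs, so the client can't bypass the check.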

WIA303 - Microsoft ASP.NET AJAX: Taking AJAX to the Next Level

I've never been a big user of Microsoft's AJAX offering - I felt their control toolkit wasn't particularly good-looking and wasn't flexible enough if you didn't want exactly what it did out of the box, and I found jQuery could do at least as much as I wanted - so I was interested to find out what improvements they've made to "take things to the next level".

First up, Microsoft have launched a Content Delivery Network, hosting many common javascript files, such as jQuery, the Microsoft AJAX.NET javascript library, and the jQuery Validation library (which I really must look into). jQuery.UI's not on there yet, but they're thinking about it.

In .NET 4.0 webforms, you can say <asp:ScriptManager EnableCdn="true" />, and it will automatically get all its script files from the CDN.

The MS Ajax Minifier is a new offering that's available as a command line tool or an MSBuild task that works in two modes: normal minification (that strips whitespace, etc.) or hypercrunching (that goes a lot further - shrinking variable names and removing unreachable code). I think hypercrunching is probably the coolest name Microsoft's come up with for any of its products for quite some time.

The minifier can also analyse your code and warn you of any problems it can recognise.

(As an aside, a nice way to edit your project file in Visual Studio is to right-click it and select Unload, then right-click and select Edit. I never knew that - I've always been opening it in a separate text editor.)

The new AJAX library isn't being shipped with VS2010 or .NET 4.0 - instead it's available as an open source project.

It's written as a pure js library, so it will work equally well in webforms, MVC, or plain html. It integrates seamlessly with jQuery, and all its controls are exposed as jQuery plugins.

start.js contains a script loader that will pull in any other scripts you need, as declared by the Sys.require() function. This makes it easy to ensure you get all the libraries you need, exactly once each. It also supports lazy loading, so you can load libraries at the point that you need them.

Items can be added to the page using Sys.create.dataView. This looks for elements with class="sys-template", which are treated as templates. The contents of the template are repeated for each item in your data source, and data fields within the template are demarcated using {{field-name}}. An itemRendered event is fired for each item to allow you to perform your own custom tasks.

Sys.bind() gives a really easy way to bind an item list with a detail view. You specify an event that fires when an item in the list is clicked, which you then use to populate your detail view. Using templates for both means you have to write almost no code.

A major new feature is the client-side datacontext. This works much like the DataContext in LINQ2SQL in that it tracks changes to records in the browser, which can then be batched up into a single AJAX post when the save button is clicked. This also supports two-way binding, so you can have the text in html elements be automatically updated in response to the user changing the text in a textbox.

WIA401 - Enhancing the Design-Time Experience for Microsoft Silverlight 3

When developing controls to be used in Silverlight, bearing in mind how the designers will experience them in Blend can make a big difference to how ready they are to slap you.

A new feature in Blend is the ability to add sample data. This makes it much clearer what the control will look like at runtime, without having to access any live data.

Attributes can improve how the control appears both in Blend and at runtime. For example, giving a property Category and Description attributes puts the property into its own group in the designer, with tooltip text to indicate what its function is. Further attributes can change how the value is entered, how it is validated, where it appears in the list, etc.

All the design-time code is wasted once you come to deliver your final product, so it makes sense to separate your design-time code into a separate assembly, by putting it into separate projects in Visual Studio. To do this, you must follow a naming convention for the project (and thus dll) names, and put classes with the same names in each project. This will then be respected by the design environment. Your projects should be named *.Design, *.VisualStudio.Design, *.Design.Editors, *.Expression.Design.

Behaviours are a powerful tool that allow you to specify a lot of functionality without any code. Some are available out of the box, such as making an item draggable, but you can also make your own, configurable by properties in the property box in Blend.

WIA307 - Cool Graphics, Hot Code: Visual Effects in Silverlight

(All samples from this session are available at http://www.wintellect.com/downloads/techedeurope2009.zip)

A class called PageTurn.cs encapsulates the Page-Turn framework. Given pairs of canvases (for the left- and right-hand pages), this provides a clean page-turning animation.

The WriteableBitmap, new in Silverlight 3, is the root of all sorts of visual goodness. This lets you generate images on the fly, as well as taking a snapshot of controls in the running application. This can be used for anything from drawing Mandelbrot Sets, to capturing frames from a video, producing a draggable magnifying glass or the much over-used Shiny Wet Black Floor effect.

[Aside: If your xaml contains a large picture, but you're scaling it down, then if you make a WriteableBitmap from this, telling it to scale up, Silverlight is clever enough to use the original large bitmap, rather than the smaller scaled one, thus avoiding pixelated blockiness.]

We saw a little more of HLSL for building custom pixel shaders too. It was developed by Microsoft and is used by DirectX, so is very high-performance. It's based on C and looks a little crazy, but it's maybe not as scary as I'd thought. We saw how this can also be used to generate the Wet Floor effect. This has the big advantage that it's real time, so if your original xaml changes, so does the reflection.

However, it's worth noting that (for now) Silverlight only uses the CPU to execute the pixel shaders; if it ever uses the GPU, we can expect much better performance.

DEV316 - Model-Based Testing Made Easy with Spec Explorer 2010 for Visual Studio

Spec Explorer explores your C# code and draws a state machine graph of the possible routes through it. This gives a different view of the model that can (possibly) be more easily compared with the original requirements. From this, it generates unit tests that will trace each route through the code.

It's a little bit crazy and I don't think I'll ever use it, and that's all I've got to say about that.

Tech Ed 2009 - Thursday - Neil Bostrom

Building Line-of-Business Applications Fast with Microsoft Silverlight and Microsoft .NET RIA Services

RIA Services provides a pattern for writing application logic that runs on the mid-tier and is projected onto the client side, making it simpler to produce three-tier applications. It's not just restricted to Silverlight.

Tim started off by showing the new Silverlight Business application template that comes with Visual Studio 2010 out of the box. This template seems pretty complete with loads of functionality like forms transitions, login, register, navigation, resources and styling. Very sexy, well worth looking through this template.

The domain service class is the guts of RIA Services. When you add one, it prompts you to point at your model. It then generates an API from your model - insert, update, delete, etc. methods for all your entities.

RIA Services does its querying in two steps: first you bind to the model collection (e.g. Employees), then you populate that collection using the Load method on the context.

RIA Services also includes some out-of-the-box helper controls. One of them is the data pager control. As this control works on IQueryable in your model, it does true server-side paging.

RIA Services takes advantage of the data annotation syntax, similar to AJAX and MVC. This is a general direction at Microsoft, which is great: we can implement validation once in our common code and get it for free on almost any client type.

RequiresAuthentication and RequiresRole are a nice way to control method calls using forms authentication on your server side.

 

Successfully Administering and Running Microsoft Visual Studio Team System Team Foundation Server 2008/2010

Turns out the install success rate for TFS 2008 was quite low. Looks like we were lucky on that front. You can never rename a team project, that's why Microsoft uses codenames for projects!

TFS performance comes down to hard disk speed: disk reads/writes in SQL are where the real hit in TFS is. Try to split the databases out.

It's recommended that you use only a single team project for your company, using iterations and areas to break down projects. Microsoft has a single team project for "Office", and all the products (Word, Excel etc.) are in that project. The main reasons behind this are that work items can't be moved between team projects, and the out-of-the-box reports work on a single team project.

Microsoft suggests running TFS in a virtual environment. The benefits come from snapshots and disaster recovery.

When splitting the tiers up, it is suggested to have SQL and SSRS on one server and the web tier on the other.

Read the online notes, LOADS AND LOADS of tips on performance. Legend!

TFS 2010 installation has been massively simplified. Project collections now allow moving everything to different servers. All the new features in TFS 2010 are only available with VS 2010.

 

Enhancing the design-time experience for Microsoft Silverlight 3

This session was about creating controls that are Blend-friendly: controls that appear in Blend and are easy to design. When creating controls that call data or web services, check whether you're in design time and return sample data in place of live data.

Override ArrangeOverride and MeasureOverride to run code on resize or move at design time. Use dependency properties to make control properties available in Blend. Standard metadata attributes like Description, Category, DisplayName, PropertyOrder, NumberRange, NumberIncrement, NumberFormat and EditorBrowsable will be used by Blend for the placement and presentation of the property.

With all this extra design-time code, you can end up with a large code base, so it is suggested you break the design-time code out into a new dll. If you name that dll *.Design.dll, both VS and Blend will go looking for it and use it on the fly. If you name it *.Expression.Design.dll, only Blend will use it; if you name it *.VisualStudio.Design.dll, Visual Studio will automatically pick it up. In this new dll, add a reference to Microsoft.Windows.Design to get access to the extension support.

The next topic was behaviours. Blend has a bunch of behaviours out of the box, and there is an option in Blend to pull down more. Behaviours are a very easy way to wrap up a small set of functionality you want to apply to any control.

To build your own behaviour, inherit from Behavior in System.Windows.Interactivity. If you want to limit it to a type, pop that in the generic type parameter. AssociatedObject stores the object the behaviour is attached to.
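A small sketch of a custom behaviour (the class name and highlight colour are illustrative):

```csharp
using System.Windows.Controls;
using System.Windows.Input;
using System.Windows.Interactivity;
using System.Windows.Media;

// Limited to Controls via the generic type parameter.
public class HighlightOnMouseOver : Behavior<Control>
{
    protected override void OnAttached()
    {
        base.OnAttached();
        // AssociatedObject is the element this behaviour is attached to.
        AssociatedObject.MouseEnter += OnMouseEnter;
    }

    protected override void OnDetaching()
    {
        AssociatedObject.MouseEnter -= OnMouseEnter;
        base.OnDetaching();
    }

    private void OnMouseEnter(object sender, MouseEventArgs e)
    {
        AssociatedObject.Background = new SolidColorBrush(Colors.Yellow);
    }
}
```

In Blend a designer can then drag this onto any control, with no code in the view.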

 

Debugging Microsoft ASP.NET and Other Microsoft .NET Production Issues with WinDbg and Microsoft Visual Studio .NET 2010

This session is a favourite of mine - I've been to it for a few years and it has always been very informative. This year it was updated to take advantage of the VS 2010 feature that allows easy opening and diagnosing of memory dump files. In Visual Studio 2010 the Parallel Stacks window shows a diagram of where every thread is, and its stack, in a lovely chart, giving you loads of insight into what's going on with your website.

As always Tess got down and dirty with some WinDbg.exe action. WinDbg.exe is a low-level debugging tool that is not aware of .NET, so to give it some knowledge of .NET you first need to load sos.dll. You do this by running the following command:

.loadby sos mscorwks

Once you have done this you can walk through some of the CLR using an array of commands. Some of the interesting things you can do is say:

!dumpheap -stat

This will list all of the objects on the heap. From here you can drill into the objects for more information by using:

!do (memoryaddress)

Once you find something that you believe should have been GC'ed, you can run the following command:

!gcroot (memoryaddress)

This gives you a tree of what the GC is holding on to! This is so immense!

If you forget what the commands are, you can easily load up a command file that gives a handy context menu on the right using:

.cmdtree c:\debuggers\cmdtree.txt

You can get the cmdtree.txt from Tess's website.

The Debug Diag tool can capture memory dumps on the fly or look for hanging requests. It's produced by the IIS team and can be easily downloaded and installed.

Tinyget is another handy little tool to stress-test websites. It means you can prep your website before taking a memory dump.

 

All You Needed to Know about Microsoft SQL Server 2008 Failover Clustering

I know nothing about failover clustering so this was going to be a sure win! Interestingly, clustering only helps with machine failover, not database failure. When a node does fail over, any application will fail to connect while the next node comes online, so you need to put retry code in to keep your application running.
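That retry code might look something like this sketch (the retry count and delay are illustrative - tune them to how long your cluster takes to fail over):

```csharp
using System.Data.SqlClient;
using System.Threading;

public static class ClusterDb
{
    // Retry the connection a few times while the next node comes online.
    public static SqlConnection OpenWithRetry(string connectionString)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                var connection = new SqlConnection(connectionString);
                connection.Open();
                return connection;
            }
            catch (SqlException)
            {
                if (attempt >= 5)
                    throw;              // give up after five tries
                Thread.Sleep(2000);     // wait for the failover to complete
            }
        }
    }
}
```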

SQL server 2008 clustering works better with Windows Server 2008 R2 clustering. So the first thing you would do is setup Windows Server Clustering. Once you have done that you can then install the SQL Server clustering installation on your primary machine. You would then add all the other machines using the Add Node option in the setup wizard.

A sexy feature of clustering is that you can roll out patches and service packs without any downtime by applying the patches to each node, one at a time. This gives you good uptime as there is no need to take your application offline. You can even do full version upgrades, moving from SQL 2005 to SQL 2008 with only a minute or two's downtime, which is just the database scripts upgrading to SQL 2008.


Wednesday, November 11, 2009

Tim Jeanes - TechEd 2009 - Day 3

DEV310 - Software Testing with Microsoft Visual Studio Team System 2010: Part2, Making It Real

TFS can be configured to a fine degree to specify how much data is captured whilst the tester is executing manual tests (event log, IntelliTrace, system information, test impact, video recorder, etc. can be turned on and off, and various settings tweaked for each of them). This could be very useful in avoiding massive bloat. Worth noting is that if you're not careful you'll capture all activity in all applications - not just the application being tested.

I'm so looking forward to IntelliTrace - the ability to wind forwards and back through the code whilst debugging is going to change everything. TFS also allows you to enable this whilst the testers are testing, so the developers can replay the code execution later.

(When using IntelliTrace, it's important to access websites by machine name, rather than localhost, otherwise the recording won't work correctly.)

TFS has a new feature called Test Impact Analysis. This records which lines of code are executed by each test case. This means that when you start testing against a new build, it will highlight which tests should be rerun. If you change something that isn't managed code (such as html), you can manually record which tests this impacts.

You can specify configuration information for manual tests - for example you can list the browsers that the test should be run against. The tester will see this as multiple tests - one for each configuration.

TFS has hierarchical work items - you can break tests down into test cases, and link the tests to work items that are user stories, for example.

You can specify test steps for a test case: you can be as granular as you like with this (either "create an account" or "click on account, click register, type your name", etc.). Granular steps are a pain to write, but mean you can track more accurately where the test fails.

You can put parameters in the test steps by prefixing them with an @ (e.g. "type @FullName"). You can then specify lists of values for these parameters, which will appear as multiple iterations of the same test. The test script tool will send these values to the application being tested with a single button click, avoiding typos.

Test steps can be marked as verification steps. You can specify the expected result, and this will oblige the tester to mark whether this step succeeded or failed.

When running tests, the tester has the option to record their actions. These can be played back later (in whole or in part) to speed things up when they redo the test later. This works across different iterations of the same test.

These action recordings can then be converted into Coded UI Tests. The code it generates is nice and clean - each granular step becomes a separate method in the generated code. It's impressive stuff - if a step involves clicking a link on a webpage, and you later change that link's id, the test will still try pretty hard to find the right link, by name or by inner text, etc.

The Coded UI Tests can (and in most cases should) be tweaked by the developer to add assertions that match the verification steps in the original manual test. There's a tool that allows you to pick DOM elements directly from the browser, and add assertions as to what properties those elements should have in order for your test to pass.

When specifying build processes in TFS, you can have it deploy your application onto a number of virtual machines in a virtual environment - one VM to host the application and another to host the database, for example. All automated tests (including the Coded UI Tests) can then be run automatically against these. As the Coded UI Tests actually open a browser to run, they can also be run on VMs in the virtual environment, testing various browsers on various OSes.

Damn, this is good!

DEV304 - Extend Your Web Server - What's New in IIS and the Microsoft Web Platform

The Web Platform Installer is a smart new tool that makes it very easy to get a web server up and running very quickly. It can install anything you choose from a massive range of products: all you need from Microsoft (.NET, IIS, etc.), plus a whole bunch of third party items (DasBlog, DotNetNuke, Wordpress, etc.). It can be used again later to fetch new tools and add-ons.

Server Core is a cut-down version of Windows that has a minimal GUI (just a command line interface) - almost all configuration has to be done remotely. (You have to explicitly turn on remote configuration for IIS though, as you'd expect.) .NET is now available on this platform. They've removed some components (WPF, of course) to keep the overall size of Server Core minimal, but ASP.NET is there.

IIS 7.5 supports secure FTP! At last!

IIS has a URL rewriter that can transform nice-looking URLs (example.com/user/foo) to the ones your application requires (example.com/user.aspx?username=foo). This gives you a nice way to smarten up the URLs of an existing application without having to change the source code of the application. The tool to set this up is pretty cute: you only have to give an example URL and it will figure out the regex to use to create the transformation.
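The rewriter's rules are regular-expression based. As a rough sketch of the kind of transformation involved (the pattern below is illustrative, not the module's actual generated output), in JavaScript:

```javascript
// Sketch of a URL rewrite rule: map the friendly URL to the real one.
function rewrite(url) {
  // Matches e.g. "/user/foo" and captures "foo"
  var pattern = /^\/user\/([^\/]+)$/;
  var match = url.match(pattern);
  if (match) {
    return "/user.aspx?username=" + match[1];
  }
  return url; // no rule matched: pass the URL through unchanged
}

console.log(rewrite("/user/foo")); // → "/user.aspx?username=foo"
```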

WIA202 - Microsoft Silverlight 3 - What's in it for Developers?

Now that more devices that access websites are using H.264/AAC encoding, it's good to see that Silverlight now supports this too, though DRM isn't currently supported (which is either a good or a bad thing, depending on who you are).

GPU acceleration is supported now (though you have to opt-in for your application: it's not turned on by default). This doesn't include 3D support just yet, though it should be out in a later version. The setting EnableCacheVisualisation will highlight the parts of your Silverlight app that will benefit from GPU acceleration.

However, you can simulate 3D using Silverlight's perspective 3D function. This uses plane projection transforms to give the 3D effect.

On animation, they've built in a bunch of easing functions - the ones available will look very familiar if you use jQuery's easing plug-in.

Pixel shaders can add effects to the visual display. There are two out-of-the-box ones: a drop shadow and a blur. You can write your own if you're not afraid of HLSL. As an easier alternative, you can use the ones made for WPF, available at wpffx.codeplex.com. Only one effect can be applied at a time, so if you want a drop shadow and a blur, you have to fudge it by containing your control in a canvas, then applying one effect to the canvas and one to the control itself.

You can take a snapshot of Silverlight controls as bitmaps. This can be handy to show reflection effects, a ghosted item whilst dragging and dropping, or for more reliable printing.

Silverlight now has a messaging system that allows multiple instances of Silverlight to communicate with one another. They don't have to be in the same browser window, nor even the same browser: an instance of IE can communicate with an instance of Firefox.

DEV304 - Deep Dive Into Developing Line-of-Business Applications Running in the Cloud

Windows Azure is great at supporting massively scalable applications, but there are a number of concerns that will need to be addressed for most such line-of-business applications.

The first is Multi-Tenancy. If your application is to be branded by multiple companies, it's important that tenant A cannot access tenant B's data. If you're using Azure tables for your data storage, you can easily create a new table for each tenant. Access to these is very fast, and having separate tables ensures each tenant's data is held entirely separately.

If you're using SQL Azure you could create a new database for each tenant. Though this would be a maintenance nightmare for the DBA in traditional computing, Azure takes away this overhead: creating a new database is quick and hassle-free.

Another issue is customisation and extensibility: each customer might want to skin the UI according to their brand; they may also need to alter the business logic within their slice of the application. This lends itself well to Azure tables: as the table can hold entirely heterogeneous data, adding additional properties to rows is no problem. The developer only has to remember to handle the case where the value is missing on older records.

DEV207 - How Microsoft Does It: Internal Use of TFS and Team System for Software Development

Microsoft have made extensive use of their own tools to develop their software, so they've learnt a lot along the way.

One interesting point they raised was that by deferring fixing bugs, they never got round to it. Though it's more fun to be developing new code, the bugs mount up like the debt you don't want to think of. They found a much better way was always to fix all known bugs before moving on to the next implementation task. (A nice new feature of TFS is that you can set it up so that it will reject a check-in that doesn't compile or pass all unit tests.)

Obviously Microsoft is a far bigger company than we are, so the scale of their development doesn't compare that well with us. However, it was interesting to note that they prefer to form Feature Crews: a team of no more than five developers and five testers, dedicated to producing a single feature over 6-10 weeks. This size compares well with the sort of development team we find works best for us.

They've dropped their policy of 70% code coverage for unit tests, as they found it led them to spend too much time maintaining unit tests and not enough time developing features. Instead, they try to target the most common usage path of the application. This was interesting for us: we're currently wrestling with the balance of test coverage versus maintenance overhead.

Tech Ed 2009 - Wednesday - Neil Bostrom

Get Virtualised with Microsoft System Centre Essentials

This is the first interactive session I've been to this year. As we are currently running Virtual Server at work, I thought this would be a good session to find out where Microsoft are going with virtualisation. I was very impressed with System Centre Essentials: it has all the management of your network in one place, and very powerful support for virtualisation right out of the box.

The best feature they showed is a three-step wizard for converting a physical machine to a virtual machine. It takes across everything from the physical machine using a bit copy. This will save us loads of hassle when moving existing boxes into Virtual Server. The manager is pretty clever when it comes to knowing which VMs will run on which host, based on hard disk, memory, etc.

 

Extend Your Web Server: What's New in Internet Information Server (IIS) and the Microsoft Web Platform

David Lowe is running this session covering the new features in IIS 7.5, which comes with Windows Server 2008 R2. He started out by covering the Web Platform Installer, which allows you to set up your server quickly and easily. It has third-party products built in, allowing you to install everything you need to run on your web server - tools like WordPress and Umbraco.

The new MMC interface now works over HTTP, allowing you to easily manage your server from anywhere. Nice!

Windows Server 2008 R2 now supports ASP.NET on Server Core. This means you can run your web server with only a command line, to reduce your attack surface. The Server Core commands look scary, mind!

Something shocking he mentioned while chatting about this: Windows Server 2008 R2 is 64-bit only!

IIS extensions are now fully supported in managed applications via Microsoft.Web.Administration.

They have a whole bunch of extensions out of the box, giving you loads of features like FTP, a media server, routing, advanced logging and even a database explorer.

Secure FTP now comes out of the box in R2. You can assign SSL certificates on your FTP site to make sure all passwords are encrypted.

The IIS Application Request Routing module is a very nice tool for setting up web farms using just IIS. It can also be used for forwarding sites to different boxes.

Must remember to use the Search Engine optimization tool more. Very sexy!

I'd never seen the Best Practices analysis tool before - it looks like a good way to get security tips on your websites.

 

Microsoft Visual Studio Team Foundation Server 2010: Becoming productive in 30 minutes

This session showed how to install TFS 2010 in 30 minutes. Microsoft have done some good work to make the install / configuration much easier. There is now a 64-bit version of TFS, meaning it can make full use of uber servers.

TFS now comes with a lovely administration console to tweak settings for TFS.

In the new version of TFS they have introduced the concept of a project collection: projects are grouped into collections, and these collections can be moved from server to server.

This session included a really nice demo of TFS with VS 2010, showing all the little changes to day-to-day work.

 

Microsoft Silverlight 3: What's in it for developers

Silverlight 3 introduced GPU acceleration support. This feature is opt-in, meaning you have to declare that your Silverlight project will use GPU acceleration. It only works in full-screen mode on the Mac, due to a restriction in Mac OS. A little titbit that came out was that stretching pixels is hard CPU work - GPU acceleration really helps with that. To enable GPU acceleration, you pop EnableGPUAcceleration in the html object tag. You also have to enable any elements in your XAML that you want accelerated by adding CacheMode="BitmapCache". There's a debug flag called EnableCacheVisualisation which will show you what parts of your Silverlight application can be improved by GPU acceleration.

Silverlight 3 also got video streaming improvements and more easing functions. IIS Smooth Streaming is now supported out of the box.

Another nice feature that came in Silverlight 3 is local messaging support. This means we can send messages between multiple Silverlight controls - it does this via a shared memory concept, and the communication works across tabs and browsers.

A quick demo was shown using the new navigation APIs. These include support for links that jump to parts of your Silverlight application, which enables the use of the back and forward buttons. Very nice!

The power behind the navigation support comes from the frame controls. You can also control the Uri that gets produced using UriMappers on the frame control.

They have a nice system to break out any referenced assemblies into smaller packages, meaning the client is more likely to cache common references. This reduces the size of your xap files.

A sexy feature when using out-of-browser support in Silverlight 3 is CheckAndDownloadUpdateAsync(). This method checks the version of the local app and allows you to get the latest version before running the application as normal.

 

Securing Microsoft Silverlight: Knowing the Enemy

As we are producing Silverlight applications, I thought I should get on top of this issue. The first topic was an easy one: packet sniffing. Simple solution - use SSL!

To protect data, keep it on the server and send down only the XAML. Isolated storage is not secure: it's just sitting around on the disk, open to the user.

 

How Microsoft Does It: Internal Use of Team Foundation Server and Microsoft Visual Studio

This session was done last year but has been updated with new stats. All the advice in this session is from Microsoft's internal experience. One of the big problems they suffered with Visual Studio development was putting off bugs - doing all the work up front and fixing the bugs at the end. This caused a big problem, as the bugs took longer to fix later.

The speaker showed a nice end-to-end storyboard of Visual Studio 2010 made three years ago. Very interesting to see where they went with it and how they associated the storyboard items with work items.

Tuesday, November 10, 2009

Tim Jeanes - TechEd 2009 - Day 2

Coincidentally in Berlin for the 20th anniversary of the fall of the Berlin wall.

20 years ago today a bunch of us got away without doing our German homework because our teacher was in such a good mood.

WIA203 - Streaming With IIS and Windows Media Services

IIS and WMS are two quite separate products, and which you use depends largely on your specific requirements and your available architecture.

WMS2008 sits best on its own server, separate from the server holding the original content. It then does all its own smart caching, automatically dumping the less-viewed content from the cache, and preserving the media that is more on demand. A raft of recent improvements mean it can handle more than twice as many simultaneous connections as could WMS2003.

On the other hand, if you're just delivering media from a web server, IIS Media Services may well be enough. It's a freely-available downloadable add-on to IIS that gives a bunch of features to improve media delivery.

It has some nice settings to give more efficient usage of the available bandwidth. Typically users only watch 20% of the video media that they actually download, so the other 80% is wasted bandwidth that the media provider still has to pay for. You can configure IIS to treat media and other data files differently dependent on their file type. Typically you'd set it to download as fast as possible for the first 5-20 seconds, then drop to a proportion of the required bit rate for the rest of the video. This gives a quick spike on bandwidth initially followed by a constant rate of just enough to ensure the user doesn't experience any delays in the media they're viewing.
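To put rough numbers on that saving (illustrative figures only, using the 20% statistic above):

```javascript
// Illustrative figures: how much bandwidth unthrottled delivery wastes
// if users typically watch only 20% of what they download.
var videoMB = 100;          // hypothetical size of the full video
var watchedFraction = 0.2;  // the 20% typically actually watched

// Without throttling, the whole file is pushed down as fast as possible:
var wastedMB = videoMB * (1 - watchedFraction); // ~80 MB paid for, never watched

// With bit-rate throttling, the server stays only a few seconds ahead
// of playback, so the waste shrinks to roughly one fast-start burst:
var burstMB = 5;            // hypothetical fast-start buffer

console.log(wastedMB, burstMB);
```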

If you need to control how and what the end user watches (for example, they may need to be forced to watch an advert before they can see the main content), you can control how they can stream and download the content. In the playlist you define on the server you can enable or disable the user's ability to skip videos or to seek within them. The URLs of the actual videos being sent are automatically obfuscated to ensure that the URL of the main content isn't guessable by the end user.

Smooth streaming is now supported. This monitors the user's bandwidth and adapts the stream quality in almost-real time. This is achieved by splitting the original media into many 2-4 second long chunks and encoding each at several different qualities. IIS then delivers each chunk in succession, switching between qualities as the bandwidth allows. As it's monitored continuously, if the user experiences a temporary glitch, their video returns to its better quality within a few moments.

Encoding at many bitrates is very CPU intensive, so doing this live is much harder work. Currently Microsoft has no offering that can manage this in real-time, so you'd need some third party hardware to do the hard work.

DEV317 - Agile Patterns: Agile Estimation

I suck at estimating timescales, and Getting Better At It has been on my list of tasks for the next period on every performance review I've had for the past 11 years. I'm glad to hear that I'm not alone - this was a very well-attended session where we commiserated together about how estimates (which are by definition based on incomplete information) become hard deadlines.

A key concept is the Cone Of Uncertainty. This measures how inaccurate your estimates are as time goes by.

ConeOfUncertainty

A couple of points to note: initial project estimates, based on the loosest information, are often out by a factor of 4 - either too long or too short; even when you have a clear idea of what the customer wants, estimates are still out by a factor of 2. Also notably, we can't guess with 100% accuracy when the software will be delivered until it's totally complete. Asked for a show of hands, the vast majority of the room said they've woken in the morning expecting to release a product that day, and then haven't: even on the last day, we can't guess how long we've got left.

As we can't avoid this, it's better to be honest about it and work with it.

User stories

The customer specifies a list of stories - items that must be in the product.

Planning poker

From the user stories, we make estimates of difficulty - not of time; this is about priorities. Planning poker cards represent orders of magnitude compared with a baseline. Each person places their estimate simultaneously; disagreements lead to discussions.

Take a baseline: a task you know well (such as a login page), then compare the complexity of each other item with this (this is twice as hard as that, etc.).

Story points

Break down stories into smaller pieces (that will later become your individual work items). Give each a number of story points: units of relative size - multiples of, say, a notional hour or day, depending on the scale of the project. These still aren't really time estimates, as you don't yet know how quickly you'll work through them.

Play planning poker again to decide the number of story points for each item.

Product backlog

The list of work items becomes your backlog, each with an estimate attached to it. At this point you meet with the customer to prioritise the items in the backlog.

Velocity

Developers commit to a number of story points for the first sprint. At the end of the first sprint, the number of completed story points is your velocity.

TFS has some plugins that help to monitor and calculate this.

Re-estimation

After each sprint, the customer may add more stories to the backlog, and can re-prioritise the backlog. The developers may also add bugs to the backlog.

Each sprint gives you increasingly accurate predictions of future delivery.
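As a sketch of how velocity feeds those predictions (the numbers here are purely illustrative):

```javascript
// Illustrative numbers: forecast remaining sprints from measured velocity.
var completedPerSprint = [18, 22, 20]; // story points completed each sprint

var velocity = completedPerSprint.reduce(function (a, b) { return a + b; }, 0)
             / completedPerSprint.length; // average: 20 points per sprint

var backlogRemaining = 90; // story points left in the prioritised backlog
var sprintsLeft = Math.ceil(backlogRemaining / velocity);

console.log(sprintsLeft); // → 5
```

Each completed sprint adds a data point, so the velocity (and therefore the forecast) gets steadily more trustworthy.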

DEV303 - Source Code Management with Microsoft Visual Studio 2010

Branching has been improved: it's now a more first-class part of TFS. Branches have permissions associated with them to allow or prevent certain users from creating or merging branches. It's a faster process now as no files have to be checked out (or even copied to your local machine) to create a branch in TFS.

You can create private branches by branching code and setting permissions on the new branch. Also, there's a graphical tool that shows how branches have been created and how they relate to one another. This is interactive, so branches can be created or merged from here.

When viewing the history of a file or project, changesets that were branches can be drilled into to show the changes that happened in the main branch, prior to it being branched into your project.

Changesets can be tracked visually through branches and merges - you can show where a change has been migrated between branches - either on the hierarchical diagram of changesets or on a timeline.

It was always a pain to merge changes where there are conflicting file renames. Fortunately this has been significantly cleaned up. The conflict is now correctly detected and you're told the original name of the file as well as the two conflicting new names.

Similar fixes have been implemented for the problem areas of moving files, making edits within renamed files, renaming folders, etc.

These version model changes make it a whole lot clearer what's going on if you view the history for a file that's been renamed - even if it's renamed to the same name as a file that was previously deleted. If you're using VS2005/2008 with TFS2010, you'll need a patch to ensure this works.

Rollbacks are now a proper feature - you don't have to resort to the command line power tool to do these. Also they now properly rollback merges (previously it would forget that the merge now hadn't taken place, so re-merging would be very difficult).

A single TFS server can now have multiple Team Project Collections. These are sets of projects with their own permissions, meaning that different teams can use the same TFS installation without access to one another's projects.

WIA403 - Tips and Tricks for Building High Performance Web Applications and Sites

This was a fast-paced session with lots of quick examples. I've not listed them all, but a few of them are here:

Simplifying the CSS can improve the performance of the browser. Basically speaking, the simpler the CSS rule, the more performant it is. Also, using "ul > li" to specify immediate children is much more efficient than catching all descendants with "ul li".

Javascript performance can be improved by making sure you use variables that are as local as possible. Similarly, the more local the property on an object (i.e. on the object itself or on its prototype), the quicker it can be accessed.

A powerful feature of javascript is that it can evaluate and execute code in strings at runtime. However, this can be very slow. It's often used to run code with setTimeout - it's much better to use an anonymous function instead.
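A minimal illustration of the difference (doWork here is a stand-in function):

```javascript
function doWork() { return 42; }

// Avoid: the string form is parsed and eval'd at run time on every tick:
//   setTimeout("doWork()", 100);
// Prefer: a function reference needs no runtime parsing:
//   setTimeout(function () { doWork(); }, 100);

// The cost difference is essentially eval versus a direct call:
var viaEval = eval("doWork()"); // re-parsed and compiled on the spot
var direct  = doWork();         // already compiled
```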

Getters and setters for properties are generally good programming practice. However, as javascript isn't compiled ahead of time, the slight overhead of traversing the getter/setter method can double the time taken to access the property.

Re-reading the length property on every iteration of a loop is not free - particularly on DOM collections, where it can mean re-counting the items. Thus saying for (var i = 0; i < myArray.length; i++) is inefficient; caching the length in a variable makes it faster. Or if you're iterating over DOM elements, you can use the firstChild and nextSibling properties instead: for (var el = this.firstChild; el != null; el = el.nextSibling)
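For example (a plain array is shown here; the same caching applies to DOM collections, where the saving is much larger):

```javascript
var myArray = ["a", "b", "c", "d"];

// Re-reads myArray.length on every pass through the loop:
var out1 = [];
for (var i = 0; i < myArray.length; i++) {
  out1.push(myArray[i]);
}

// Looks the length up once and keeps it in a local variable:
var out2 = [];
for (var j = 0, len = myArray.length; j < len; j++) {
  out2.push(myArray[j]);
}
// Both produce the same result; the second avoids a property
// lookup per iteration.
```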

Having a lot of cases in a switch statement is also slow: each case has to be checked in turn. A sneaky trick that can be employed is to build an array of functions instead, and just call the appropriate one. Obviously this doesn't apply in every situation.
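A sketch of the trick (the handlers here are hypothetical):

```javascript
// switch: each case is tested in turn until one matches.
function describeSwitch(code) {
  switch (code) {
    case "a": return "alpha";
    case "b": return "bravo";
    case "c": return "charlie";
    default:  return "unknown";
  }
}

// Lookup table: one property access replaces the sequential tests.
var handlers = {
  a: function () { return "alpha"; },
  b: function () { return "bravo"; },
  c: function () { return "charlie"; }
};

function describeTable(code) {
  var fn = handlers[code];
  return fn ? fn() : "unknown";
}
```

Both give the same answers; the table version wins when the number of cases is large.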

Rather than having many small images on your page, it's better to have one large image and specify offsets for each one you want to show. SpriteMe is a handy tool that will make this mosaic for you: it scans your page for images and then glues them all together.

Doloto is a tool that monitors your javascript and tracks which functions are called. It can then generate javascript library files that are dynamically loaded in the page as they are needed. It ensures that the most common functions are available immediately and others are loaded later. A quick demo on Google maps showed that it could reduce its bandwidth spent on retrieving javascript files by 89%. Impressive stuff!

Microsoft Expression SuperPreview can render a page as it would appear in different versions of different browsers (Firefox, IE6, 7, and 8). It will show the results side-by-side (or even superimposed on one another) and even spot the differences for you (and highlight them).

WIA305 - What's New in ASP.NET MVC

A new feature is to be able to compose a view out of multiple calls to actions. This appears as <% Html.RenderAction() %>. This is a little smarter than Html.RenderPartial as you can perform any necessary business logic where it belongs in the correct controller instead of having to shoe-horn it in elsewhere.

Areas allow you to group parts of your application into logical groups. It behaves like a folder in your project, with subfolders for models, views and controllers. There's also an AreaRegistration class that registers routes for the Area. Global.asax has to call AreaRegistration.RegisterAllAreas to activate these. By default, the area appears at the top level of its views' URLs.

An exception is thrown at runtime if Areas contain controllers with the same name. This can be circumvented by specifying which namespace should be used for the controller, when you register the route.

Working with ASP.NET MVC 1, I've felt frustrated with the validation. A new model allows you to specify your validation once and have it applied to each layer. Your validation rules can be specified by using Data Annotation attributes, or in an XML file, or elsewhere if you write your own provider. To enable client-side validation you only need to include a couple of Microsoft javascript libraries and add the helper method <% Html.EnableClientValidation(); %> to your page. If you invent your own validation rules, you'll also have to write your own javascript version of that validation logic - the built in framework passes the rules as JSON that you can intercept on the client.

There are some new helper methods - Html.Display, Html.DisplayFor, Html.DisplayForModel, HtmlEditor, Html.EditorFor and Html.EditorForModel. Given a model, these display read-only or input field versions of all the fields on the model. You can define your own templates for these - either by type (to implement your own date picker, for example), or by giving a name of a partial view that renders an editor for the whole model. This respects inheritance too: if no template has been defined for Employee, it will fall back to the template defined for Person.

A nice little tweak is that by default, JsonResult now won't allow HTTP GETs. This dodges a cross-site scripting vulnerability, though you can override it if you really want.

Tech Ed 2009 - Tuesday - Neil Bostrom

Microsoft Web Platform Overview

This talk was a nice overview of the Web Platform Installer, an easy way to install all the standard components required for web development. The tool allows quick installation of ASP.NET MVC, ASP.NET, IIS (and its optional components), SQL Server and loads of third-party plugins like DotNetNuke and Umbraco. The tool is always up to date with the latest components, as it reads an RSS feed during load.

The tool is extremely handy for setting up new servers. You can choose all the components required by your application and get them all installed at once. This reduces setting up a new server to the simple job of just running the Web Platform Installer.


Source Control Management with Microsoft Visual Studio Team Foundation Server 2010

This was the second talk I went to by Brian Harry. The talk first covered the changes coming to branching and history in TFS 2010.

Loads of work has been done in this area to make branching a first-class citizen. You can now fully control permissions on branches, allowing you to have private branches.

They have also stepped up the history support, allowing you to see history across branches. This drill down support will be extremely handy as I always want to see the full history of an item.

Another sexy feature introduced is a set of graphs showing where changesets sit in the branch structure. With these graphs we can easily see where a changeset has been merged to, and when. They aren't just charts: you can drag a changeset to another branch and it will pop up the merge window with the information preloaded.

Brian also spoke about the install changes to TFS, with the introduction of basic, standard and advanced wizards. Basic just has the source control items, not including any SharePoint documents or reports; standard is the usual install; and advanced is a fully customisable version allowing you to easily put your SQL into an existing cluster, or use a web farm if you already have one.

The second half of this talk covered a new product that Microsoft have just purchased which gives full TFS support in Eclipse, the open source Java IDE. This is a great move forward for Microsoft, as it extends the use of TFS to all developers. The plugin has all the features of the VS version of TFS, even including support for building your Java code in Team Build and - even more interesting - full support for JUnit to feed back into the build status. I spoke to the speaker after the session to ask about Xcode support. Currently Apple has no provider model for Xcode, but if that changes they will hop on it.


Tips and Tricks for building high performance web applications and sites

This session was a little bitty, but had some interesting nuggets. The first area was CSS: Giorgio went into details about the performance of CSS in browsers. The key advice here was to keep your selectors simple and targeted.

Javascript was the next topic. A whole bunch of javascript samples were shown that demonstrated what to avoid. DOM access is very expensive in javascript. An interesting point raised during one of the samples was that if you miss out the var when assigning a new variable, it becomes a global variable. That's bad news!


Cracking Open Kerberos: Understanding how active directory knows who you are

An excellent session on the ins and outs of how Kerberos does its job. Kerberos is a ticket-based authentication system with two types of tickets: a ticket-granting ticket (TGT) and a service ticket. First, the client computer encrypts the current time using a key derived from your password and sends it to Active Directory, which verifies the password by decrypting the time with the password it knows. Once that passes, the server creates a ticket-granting ticket, encrypted using the server's own built-in password. This ticket only lasts 10 hours, but gives you access to the TGS (Ticket Granting Service). The client computer then asks the TGS for a service ticket for a particular service (a print service, say). AD returns a new ticket for the client and the service, and the client computer can then use that service ticket to talk to the service.

Monday, November 09, 2009

Tim Jeanes - TechEd 2009 - Day 1

ARC201 - The Windows Azure Platform - How and When to Use It

The pricing for Windows Azure has been out for a little while, but I found it unclear exactly what you were paying for - especially for hosting web applications. What does $0.12 per hour really mean? That's CPU hours, surely? Well no - it turns out it's bad news: that's $0.12 per hour per web role or per worker role. And an hour is an hour: for every hour that goes by on your watch, you pay $0.12 for each role you have running.
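In other words (a quick back-of-the-envelope calculation using the rate quoted above):

```javascript
// $0.12 per role per wall-clock hour, billed around the clock:
var ratePerHour = 0.12;
var hoursPerMonth = 24 * 30; // roughly 720 billable hours in a month

var monthlyCostPerRole = ratePerHour * hoursPerMonth;
console.log(monthlyCostPerRole.toFixed(2)); // → "86.40" dollars per role per month
```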

That makes things very expensive very quickly if you're hosting a small web site, but this isn't what Azure is for. This gets very cheap very quickly if you're running Facebook.

This is also a great pricing model if your site has massive spikes in usage: ticket sales apps or tax service sites for example. You can basically turn your site off for 95% of the time, then spin up 20 servers in the space of a minute just before the tickets go on sale and drop back to zero just after they're sold out.

I think a more common route into using Azure will be for data storage. Blob storage is only $0.15 per GB per month (plus $0.15 per GB downloaded). If you've got a whole load of video or other media that you want to be constantly available, Azure could be a great dumping ground for it all: it's reliably available, isn't going to use up your bandwidth, and you can leave the rest of your app running outside Azure in an environment you're more comfortable with.

Using SQL in the cloud, however, is far more expensive than raw data storage: though prices are measured by the size of your database ($9.99 per month for a database up to 1GB; $99.99 for up to 10GB), what you're really paying for is all the functionality (and CPU time) of having a structured database. You've got to know that your database is going to be used regularly before this becomes worthwhile.

 

DAT204 - What's new in Microsoft SQL Azure

Oh thank goodness! Microsoft listened to everyone last year saying that accessing SQL Azure via REST was a major pain in the body part. We also now have everything you'd expect in a proper fully-functioning SQL database: gone are the days where you just have basic tables with string keys, partitioned by rules you specify.

Instead now you access SQL Azure via SQL Server Management Studio: you just connect to [database name].database.windows.net, and very nearly all the functionality you'd expect from SQL Server is right there. There are a few exceptions: for example you can't use the USE keyword, as they can't guarantee your two databases are on the same physical server (though they are on the same logical server) - in fact they probably won't be, to optimise performance.

A server is only a logical server, not a physical box. It's a unit of authority and a unit of geo-location, so this should be what drives the point at which you create another server. As you get to choose which datacentre hosts your database, you can also get the best performance by ensuring your application runs in the same datacentre as your database. And as data transfer within a datacentre is free, you won't be eating into that $0.15 per GB download cost.

There's a nice improvement to security too: your server now comes with a firewall, allowing you to restrict access by IP address range. Somewhat novelly, you can't create accounts called sa, admin or root, simply because those are the most commonly hacked.

(And next week they'll announce that you can replicate from Azure to SQL Server but shh! - you're not allowed to know that yet!)

 

DEV-GEN - Developer General Session

The Developer General Sessions at TechEd are attended by pretty much every developer at the conference, so they're held in the largest hall on site.

"Wi-fi access is not available in this hall," said the sign outside. A shame, because the main part of the talk was really quite dull: very high-level and largely uninformative. "VS2010's made of WPF!" My cat knew that - and I haven't even got a cat.

Tech Ed 2009 - Monday - Neil Bostrom

What's new in Windows Communication Foundation 4.0

I've recently started using WCF in anger on one of our projects, which is why this session interested me. The first major feature discussed was the new routing functionality. This allows a single endpoint to publish multiple services and protocols: the endpoint forwards each request to the correct service based on its path or content type.
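The routing idea itself is simple enough to sketch. This is not the WCF API - just a minimal Python illustration of the filter-table concept behind WCF 4's RoutingService, with made-up endpoint addresses:

```python
# Minimal sketch of content-based routing, illustrating the idea behind
# WCF 4's routing feature (this is NOT the WCF API): a table of filter
# predicates mapped to backend endpoints; the first matching filter wins.

filter_table = [
    (lambda msg: msg["path"].startswith("/orders"), "net.tcp://backend1/orders"),
    (lambda msg: msg["content_type"] == "text/xml", "http://backend2/legacy"),
    (lambda msg: True,                              "http://backend3/default"),
]

def route(msg):
    for matches, endpoint in filter_table:
        if matches(msg):
            return endpoint

print(route({"path": "/orders/42", "content_type": "application/json"}))
# net.tcp://backend1/orders
```

In WCF the filters and their backing endpoints live in configuration rather than code, but the first-match dispatch is the same.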

The next sexy feature was auto-discovery of services. At the moment you always need to put the connection information into your client. With discovery, we can just create a channel for a contract and it will go and attempt to discover whether anything is hosting that contract. This means you can easily do load balancing just by putting up more services offering that contract: the client will simply pick the first one it finds. A bonus is that you can take services out without affecting the clients.

To publish your service as discoverable, you create a UdpDiscoveryEndpoint. This uses UDP packets to advertise the service to clients.
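The find-by-contract idea can be sketched with a toy registry. Again, this is not WCF's API (WCF does this over UDP multicast with WS-Discovery); the contract and host names are invented for illustration:

```python
# Toy registry illustrating the discovery idea described above (NOT the
# WCF API): services announce which contracts they host, and clients
# find a live host for a contract instead of hard-coding an address.

registry = {}  # contract name -> list of endpoint addresses

def announce(contract, address):
    registry.setdefault(contract, []).append(address)

def find(contract):
    hosts = registry.get(contract, [])
    return hosts[0] if hosts else None

announce("ICalculator", "http://hostA/calc")
announce("ICalculator", "http://hostB/calc")
print(find("ICalculator"))   # http://hostA/calc

# Taking hostA out of service just changes which host clients find:
registry["ICalculator"].remove("http://hostA/calc")
print(find("ICalculator"))   # http://hostB/calc
```

That second half is the "take services out without affecting the clients" point: nobody ever stored hostA's address, so nobody breaks when it goes away.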

Christian suggested that using ChannelFactory.CreateChannel() is better than storing the configuration in the web.config. I hadn't really got around to looking into CreateChannel; I'll definitely look at making use of it.

WCF 4.0 also has better REST support, including JSON-P.


Microsoft Visual Studio Team Suite 2010: A lap around the developer and tester experience

Brian Harry did a good whirlwind tour of the new features coming in TFS and VS 2010. He started off by showing the new tester experience. They have a new product coming out called Lab Management, built for testers to do all types of testing, including manual and integration testing.

The sexy part of TFS / Lab Management is that during test runs it records all sorts of information about what is happening: screenshots, a live video recording, full system information, and historical debugging data. All of this just happens in the background and gets attached to any bugs you create, giving developers the maximum information about each bug.

Another cute feature is that the tests know the code that they cover. This means that when a new build gets produced, it can notify the tester of the manual tests that need to be re-run because they were affected by that build.
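Stripped right down, that impacted-test feature is an intersection between a coverage map and the build's change set. A sketch with made-up test and file names:

```python
# Sketch of the impacted-test idea: given a map of which files each
# manual test covers, a new build's change set tells you which tests
# need re-running. Test and file names are invented for illustration.

coverage = {
    "checkout flow": {"cart.cs", "payment.cs"},
    "user signup":   {"account.cs"},
    "search":        {"search.cs", "index.cs"},
}

def impacted_tests(changed_files):
    return sorted(t for t, files in coverage.items() if files & changed_files)

print(impacted_tests({"payment.cs", "index.cs"}))
# ['checkout flow', 'search']
```

The clever part in TFS is that the coverage map is gathered automatically from earlier test runs rather than maintained by hand.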

Brian moved on to TFS features next, showing the nice branch visualisation that lets you track changes across branches and see where they have been applied. Brian also mentioned that conflict resolution has been improved, which should be interesting to try.

TFS finally has rollback support built into the interface, meaning I don't have to use the command line to roll back changes.

One feature I'm really excited about is that TFS Build has been updated to use Workflow Foundation on top of MSBuild, making build scripts much easier to put together. This should give us loads more power in the build script; it's been really painful up to this point to build complex build scripts.

The new version of TFS will come in three editions: Basic, Standard and Advanced. Basic gives you the base-level source control functionality. Standard includes the SharePoint sites and SQL Reporting features. The Advanced edition allows you to tweak pretty much any configuration, so you can run on a web farm or an existing SQL cluster, etc.

The server version of TFS will finally come with an admin console, meaning you don't have to use the command line to make changes. It also includes lots of improvements for scaling out your installation by load balancing any tier: web, business or data access.

Microsoft Office System 2010 Client Overview
This was a bit of an eye-opener for me, as I knew very little of the latest Office buzz. The guy's presentation steamed straight into new features and changes in the Office products. The interesting part was that they kinda just assumed you knew there were online versions of all the Office products. That work in any browser on any platform! OMG! I was not aware of this!

It seemed like the web version would use Silverlight, if it's installed, to give a sexier interaction, but will fall back to an HTML / JavaScript version if you don't have SL installed. The JS version was still fully featured and worked in every major browser!



Tech Ed Keynote

Usually the keynotes at Tech Ed are sexy and sell harder than the Berlin wall! However, today's keynote seemed to just be a big flop: a few random videos with no content and a long speech that lacked any real information. The small bits I did get out of the keynote were BitLocker To Go and some vague cloud computing talk. Even the Exchange demo seemed to just be some random side feature.