A Great 3ds Max Resource

Posted 20 January 2012, 22:40 | by | | Perma-link

While I don't do anywhere near as much with 3ds Max as I should, I do like to keep learning about it and playing with it every now and again.

One of the most useful resources I've come across in a long time is Jamie Gwilliam's Blog: Jamie's Jewels. Jamie's recently been running a series of "Quick Tips" on his blog which I've found to be incredibly useful, and he also occasionally runs competitions where you can win the odd bit of Autodesk/3ds Max branded or related gear.

His last one was just a call for shots of 3ds Max jewel cases or software boxes, and luckily I had a few very old ones sitting around in the loft, having switched to digital distribution years ago. I only got around to sending a shot of the boxes the day before the deadline, so was very happy to receive a very nice iPhone case this week. :)

Anyway, follow @3dsJamiesJewels over on Twitter now to find out all sorts of things about this great app.

Doodle Grouper: A How-To

Posted 28 November 2011, 23:00 | by | | Perma-link

Doodle Grouper Logo

While how Doodle Grouper actually works is perfectly clear to me, I'm occasionally reminded that it's not as intuitive for others as I'd like, so here's a little "How-To" guide that covers most of the features currently in 1.0.0.7.

Adding Users to a List

The easiest way to add a user to a list is to use the "Add to list..." action the plugin adds to the actions on the User Icon:
The "Add to list..." Action

This brings up the "Add User To List" Dialog:
Add User To A List

Here you can add the user to an existing List by selecting it and pressing OK, or create a new List by pressing "Add a new list", then selecting it and pressing OK.

The Lists you create with this plugin appear in the User Lists tab of Seesmic:
User Lists

You can show or hide the columns by clicking on them - the order they appear in the main Seesmic space is determined by the order you show them.

Modifying a List

Once you've displayed a list, you can modify it through the column settings, selecting "Modify this list..."
"Modify this list..."

Then on the "Modify List" dialog, you can rename the list, add more users manually (user names are case sensitive, so be careful doing it this way), delete the entire list, or remove a user by selecting them from the list; press "Update" to save your changes:
Modify The List

The remove function only removes one user at a time at the moment.

Advanced Functions

On the Settings pane for the Doodle Grouper plugin, you can import and export your lists, for example for use on a different computer. While you can export the file to pretty much any location you like, note that you can only import a file from within your "My Documents" folder.

Doodle Grouper Settings

The format of the XML is as follows:

<?xml version="1.0" encoding="utf-8"?>
<ArrayOfUserList xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                 xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <UserList>
    <Name>List Name 1</Name>
    <Users>
      <string>User Name 1</string>
      <string>User Name 2</string>
    </Users>
  </UserList>
  <UserList>
    <Name>List Name 2</Name>
    <Users>
      <string>User Name 3</string>
      <string>User Name 4</string>
    </Users>
  </UserList>
</ArrayOfUserList>
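As an aside, because the export is plain XML, it's easy to inspect or generate a lists file outside Seesmic. Here's a minimal sketch in Python - the helper name and the inlined sample are mine, not part of the plugin:

```python
import xml.etree.ElementTree as ET

# A cut-down export file, matching the layout shown above: an
# <ArrayOfUserList> containing <UserList> elements, each with a
# <Name> and a <Users> collection of <string> entries.
sample = """<?xml version="1.0" encoding="utf-8"?>
<ArrayOfUserList xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                 xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <UserList>
    <Name>List Name 1</Name>
    <Users>
      <string>User Name 1</string>
      <string>User Name 2</string>
    </Users>
  </UserList>
</ArrayOfUserList>"""

def read_lists(xml_text):
    """Return a dict mapping each list name to its user names."""
    root = ET.fromstring(xml_text)
    lists = {}
    for user_list in root.findall("UserList"):
        name = user_list.findtext("Name")
        users = [u.text for u in user_list.findall("Users/string")]
        lists[name] = users
    return lists

print(read_lists(sample))
```

Remember that the plugin matches user names case-sensitively, so keep the capitalisation exactly as it appears on Twitter when editing the file by hand.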

I tend to leave Seesmic running, and have encountered memory issues when running with too many lists, so I added the ability to throttle the number of updates that the plugin keeps in each List.

If you "Enable logging", the plugin will add more verbose details into the Seesmic log file to tell you what it's doing.

At the time of writing (1.0.0.7), the "Enable grouping of retweets" option doesn't work.

Filed under: Plug-ins, Seesmic

Bricks.StackExchange.com is now live

Posted 16 November 2011, 15:43 | by | | Perma-link

It's finally here: after 12 months of other people defining it and committing to support it, one of the latest StackExchange sites to enter public beta is LEGO Answers - Bricks.StackExchange.com*, the place to ask all your LEGO and compatible brick building questions.

This is the first time I've been really actively involved in the private beta of one of these sites - I was in the WebMasters one enough to fulfil my obligations and earn the Beta badge, but not as much as this time, so here are my thoughts on the process.

Remote collaboration is hard

Working remotely with a group of others, on a site that doesn't encourage "chatter", is hard. Chat is there, but it's not that heavily used at the moment, when what you really want to do is sit down with a few drinks and discuss some of the issues. I had to remind myself on a few occasions not to take things personally, especially as there is a large group of users for whom English isn't a first language - they're doing a lot better than I would if the site were in German or French ;) - and I just hope I didn't cause any offence either…

Moderation requires effort - let us help!

One of the features of StackExchange sites is that as users gain reputation, they gain access to tools and features on the site that give them more say in the running of the site - as is pointed out on the FAQs:

At the high end of this reputation spectrum there is little difference between users with high reputation and moderators. That is very much intentional. We don’t run LEGO® - Stack Exchange. The community does.

But also, during the beta phase, the reputation requirements are lowered so that the actual users, rather than CHAOS moderators, do most of the day-to-day running of the site.

During Private Beta, the reputation requirements are very low - around 1,000 rep gets you access to moderation tools - and by the end of private beta we had three or four users who were just over this threshold and helping out Dori and Robert (the two mods I've really seen around, no disrespect to Geoff, Emmett or Rebecca). Suddenly the site goes into Public Beta, the threshold's instantly raised to 2,000 rep, and a load of new users join the site (that's a good thing :D), but suddenly it's left to Dori and Robert to manage the influx (it probably wasn't that big) of flags, suggested edits, etc., and my edits suddenly need approval again - I mean, I've proved myself already, haven't I?

I guess part of this would have been resolved if the influx of users had actually been asking questions that people could answer and vote on - then the reputation would have risen fast enough to counter this. As it is, after nearly two weeks in public beta, I've nearly regained access to the moderation tools ;).

Conclusion

We need your help - come and ask questions - come and give better answers - and don't be put off by thinking your question is silly, or because someone's edited your answer - we're here to help, and to craft the best resource for LEGO Q&A there is - we really need to increase our daily question rate, for example.

As some examples, I've had some great answers about building graveyard rooms for Heroica games and building a roof with a 60° pitch, while others have been trying to work out how to invert the direction of bricks so that the studs are touching.

* The name may well change - The LEGO Group understandably don't want "LEGO" in the title. Vote for your choice over on the meta question: Site may not be described as LEGO Answers - call for alternative titles

Filed under: Lego, StackExchange, Websites

SharePoint People Search

Posted 11 October 2011, 16:29 | by | | Perma-link

There are a number of useful blog posts out there on the web that talk you through setting up SharePoint's People Search correctly; however, the following were incredibly useful to me:

Corey Roth's "How to set up People Search in SharePoint 2010" was the most helpful for me - however I think there's something different about the instance of SharePoint I'm running:

When Corey suggests giving the Default Content Access account rights in the User Profile Service Application, I didn't have the option of "Retrieve People Data for Search Crawlers", however giving it access to "User Profile Application" and "User Profile Service Application Proxy" seems to work quite nicely*.

Also, as all my browsers are set up to prefer British English over US English (or just English), we weren't seeing any of the Nickname or Phonetic search features that are included in SharePoint 2010.

Thanks to Kristof Kowalski's post "SharePoint Server 2010 Phonetic and Nickname Search" and the results of his support request, we were able to get this working for all our British-speaking users ;), and now searching for "Benjamin" will bring me back as "Ben".

* Having finally installed Service Pack 1, and the June, August and October Cumulative Updates the permissions list now matches Corey's blog.

Filed under: Search, SharePoint

SharePoint Database Moves - Unknown SQL Exception 53

Posted 05 October 2011, 16:21 | by | | Perma-link

This is one of those jobs that looks so simple on the outside, but can lead you down a massive rabbit hole if you're not careful:

We're in the process of changing an instance of SharePoint from a true "Single Server" setup to having the databases on a separate SQL Server, and there's plenty of documentation about that, starting with: Move all databases

We'd also followed the recommendations in Harden SQL Server for SharePoint Environments but after we'd moved the Content Database to the new server and attached it in SQL, all attempts to "Add a content database" in Central Admin resulted in errors such as:

Cannot connect to database master at SQL server at SqlServer2. The database might not exist, or the current user does not have permission to connect to it.

Digging into the ULS logs we found the following Critical errors:

Unknown SQL Exception 53 occurred. Additional error information from SQL Server is included below.  A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server)

It turns out that sometimes the SQL Configuration Manager needs a little bit of a helping hand, and you need to start the old-school "SQL Server Client Network Utility" with the following command (note the 8.3 format of the name):

cliconfg.exe

On the Alias tab of this app, add the Aliases you added in the SQL Configuration Manager, and you should be good to go.
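For what it's worth, the aliases that cliconfg.exe manages are just string values in the registry, so they can also be scripted. The sketch below builds a TCP/IP alias value and, on Windows, writes it under the client's ConnectTo key - treat the key path, the value format, and the invented server name as assumptions to verify against your own machine (and note that 32-bit client tools on 64-bit Windows read the Wow6432Node equivalent):

```python
import sys

# cliconfg.exe stores aliases under:
#   HKLM\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo
# The value name is the alias, and the data names the network library
# plus the real target: "DBMSSOCN" is the TCP/IP library (Named Pipes
# would be "DBNMPNTW").

def tcp_alias_value(server, port=1433):
    """Build the registry value data for a TCP/IP alias."""
    return "DBMSSOCN,{0},{1}".format(server, port)

if sys.platform == "win32":
    import winreg
    # NOTE: hypothetical real server name - substitute your own.
    key = winreg.CreateKey(
        winreg.HKEY_LOCAL_MACHINE,
        r"SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo")
    winreg.SetValueEx(key, "SqlServer2", 0, winreg.REG_SZ,
                      tcp_alias_value("realsqlbox.example.local"))
    winreg.CloseKey(key)
```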

Filed under: Fixes, SharePoint

First Thoughts on Windows 8

Posted 17 September 2011, 23:51 | by | | Perma-link

I'm drafting this post in WordPad*, on the basis that I don't have much else installed on Windows 8 - I could have written it directly into the browser, but this feels like a better place to start - I've got a lot to say, and may need to make a small mini-series out of it.

Setting the scene a bit, I was encouraged to write this after I pulled Scott Hanselman into a G+ discussion about other people's views on the Windows 8 Developer Preview. We were challenged by him to stop wasting our precious keystrokes and blog about it, and others had previously been riled by my statement that we're not "typical" users any more, and wanted to know my thoughts… All the while I'm trying to "remember that this is a Developer Preview, it's not Beta, or even Alpha".

Don't waste your keystrokes - Scott Hanselman

Whilst I've not yet spent all that long playing with it, I've read and watched a few things around the web this week, and so have been forming thoughts for a little while. Here's my opinion, and my guesses as to some possible justifications - note that I am guessing here, and your guesses might be different and more accurate.

One of the big discussions we'd been having was "Who is Metro aimed at?". It's clearly a "Touch First" UI, and indeed prefers a tablet form factor over anything else - using a mouse in there really does feel like a poor runner-up choice. My stance was that these days more "typical" users are using tablet devices (mostly with an i at the front), and MS are clearly hoping to get in on that action; those users are using them for content consumption rather than content creation (whatever that content may be, from graphics, music or video through to Word documents, spreadsheets, and applications). In that sort of situation, where a user is consuming some media, perhaps alongside watching the telly, a full screen app held at not quite arm's length on a touchable interface makes a lot of sense. For those of us still creating content, generally this will require the more formal interfaces of mouse and keyboard**.

Now that I've been using it, my initial thoughts are somewhat confused - but I hope that this is because of the early nature of things - a lot of the apps behave inconsistently - for example when I first saw the Metro start screen I thought: "I hope that scrolls left and right as I scroll the mouse wheel up and down", and it does, so I am happy navigating around in there. But so many of the other apps don't do this at the moment (Socialite, Tweet@rama and News/RSS to name three) - which worries me: is this something that Devs are going to have to implement in every app? Shouldn't this be the "native" behaviour of the "Grid Application"?

I know Metro is targeting the Tablet form factor, and I don't yet know how much the frameworks are hiding the fact that the user is using a mouse or a finger for navigating around, but there seems to be a lot of "mystery meat" navigation or buttons, and the usual trick of hovering over them doesn't work, because nothing pops up to tell you what the name of the button is - presumably because you can't hover with a finger.

Take Metro IE for example - what does the button on the right do:

Pin and WTF?

Open a new tab? Open a new URL? Apparently it allows you to search for content, or switch to Desktop IE:

New? Nope. Find or Switch

One of the things I've been dabbling with over the last few years is XNA - this isn't in the Windows 8 Dev Preview, and hasn't been mentioned on the technology plans (other than in the negative). I believe the reason for this is the move to DirectX 11 - XNA at the moment targets the implementations of DirectX on the XBox and Windows Phone platforms (DirectX 9) - and there are at least two factors at work here:

  1. Different Hardware - Historically, new versions of DirectX have needed new hardware to fully support them; this isn't an option with the XBox or Phones, and adding a new version of XNA that doesn't work on the XBox would cause confusion, I'm sure.
  2. Different Teams and Timescales - XNA has often lagged the framework updates a little bit.

This was exacerbated for me at least with the demo at //build/ where they show building a controller aware application, and to do this, they had to build a C++ wrapper - this raises a much higher barrier to entry than just adding a Controller object to your class.

This is a shame, because I've always held the opinion that the easiest way to get kids interested in programming is to get them writing games - it's how I started, and how a lot of my friends started. But then along came the consoles, and they were hard to develop for unless you had the funds for a dev kit ($10,000 for the original XBox I believe); XNA changed all that, and it would be a shame to lose it again.

It's possible that these wrappers will move into the main framework (or sit officially in the Microsoft namespace rather than System) at some point in the not too distant future.

Overall, I think I'll like it, the system is generally snappy and responsive (but this hardware is no slouch), and I'm putting the glitches there down to early drivers and odd hardware rather than anything else, but I'm not sure how much time I'll spend in Metro.

There's more to come, but that will do for now, as I was up too late last night playing with Windows 8.

* That was a very bad choice: at this point in time, on this computer, the rendering of the text in the document, and more importantly the location of the caret in relation to it, is way off - I've not used it in so long I can't say if this is a feature of the app or the OS.

Messy text in WordPad

** Which now neatly sets the scene for Voice Recognition to come of age, and we'll all be plugging a Kinect into our PCs to enable the touch features even without a touch screen. April 2012 at the latest, along with my flying car.

Filed under: Opinion, Windows 8

Interesting finds around PowerShell Mandatory Parameters

Posted 20 June 2011, 15:47 | by | | Perma-link

I noticed an interesting thing today while attempting to make some parameters on a PowerShell script required.

As I'm just starting to get the hang of PowerShell, you'll have to bear with my script-kiddie tendencies, however, here's the background:

I'm creating a script that will take the following parameters (can you tell I'm still working with TFS?):

  • [string] WorkItemType
  • [string] ProjectsToUse
  • [switch] ExportWitd

I wanted to make the WorkItemType parameter required, and searching around the internet led me to the following code:

[string] $WorkItemType = $(throw 'workItemType is required')

Which does indeed "work", if you're happy with exceptions being thrown when you don't set it:

workItemType is required
At C:\Scripts\UpdateTFSWorItems.ps1:17 char:37
+     [string] $WorkItemType = $(throw <<<<  "workItemType is required"),
    + CategoryInfo          : OperationStopped: (workItemType is required:String) [], RuntimeException
    + FullyQualifiedErrorId : workItemType is required

It also doesn't help if more than one parameter is required (you only get the first error), and get-help doesn't help either:

PS C:\Scripts> get-help .\UpdateTFSWorItems.ps1
UpdateTFSWorItems.ps1 [[-WorkItemType] <String>] [[-ProjectsToUse] <String>] [-ExportWitd]

As you can see, get-help thinks that WorkItemType is optional (the square brackets around the name and the type).

The actual answer came in the form of the parameter attribute:

[parameter(Mandatory = $true)]
[string] $WorkItemType

Now when I run the script without the WorkItemType parameter I get prompted to supply the required parameters as a read prompt, rather than nasty exceptions:

cmdlet UpdateTFSWorItems.ps1 at command pipeline position 1
Supply values for the following parameters:
WorkItemType:

However, this also has some fairly major consequences:

It converts your script into an "Advanced Function"; as some of the documentation on TechNet states:

[T]o be recognized as an advanced function (rather than a simple function), a function must have either the CmdletBinding attribute or the Parameter attribute, or both.

There doesn't appear to be a way to apply one or other of these attributes and not be recognised as an advanced function.

Because your script is now "advanced", get-help is a bit different:

PS C:\Scripts> get-help .\UpdateTFSWorItems.ps1
UpdateTFSWorItems.ps1 [-WorkItemType] <String> [[-ProjectsToUse] <String>] [-ExportWitd]
[-Verbose] [-Debug] [-ErrorAction <ActionPreference>] [-WarningAction <ActionPreference>]
[-ErrorVariable <String>] [-WarningVariable <String>] [-OutVariable <String>]
[-OutBuffer <Int32>]

Where did all those extra parameters come from? Your function now supports all the "Common Parameters". This is more clearly stated if you add some help documentation to your script:

<#
.Parameter WorkItemType
    Required. The name of the work item type you want to update in TFS.
.Parameter ProjectsToUse
    New or Legacy, defaults to New.
#>

Calling get-help now results in the following syntax line being auto-generated:

SYNTAX
    C:\Scripts\UpdateTFSWorItems.ps1 [-WorkItemType] <String> [[-ProjectsToUse] <String>]
    [-ExportWitd] [<CommonParameters>]

In general though I think this is a bonus, as it allows me to call the write-verbose and write-debug methods to get additional output and breakpoints.

If you're going to go to the effort of adding parameter documentation, you might as well also supply the HelpMessage parameter of the Parameter attribute:

[parameter(Mandatory = $true, HelpMessage = "The name of the work item type you want to update in TFS.")]
[string] $WorkItemType

Which then allows the user to get some help as they are prompted:

cmdlet UpdateTFSWorItems.ps1 at command pipeline position 1
Supply values for the following parameters:
(Type !? for Help.)
WorkItemType: !?
The name of the work item type you want to update in TFS
WorkItemType:

Gotcha with Write-Verbose

Write-host appears to take an array of objects and write a string representation of them out to the screen, which can be (ab)used in "interesting" ways:

write-host "Exporting WIT Definition from TFS Project" $selectedProjects[$i]

Results in output like:

Exporting WIT Definition from TFS Project Generic

Write-verbose (and indeed write-debug), however, explicitly takes only a single string, and calling it in the same way results in a nasty exception to that effect, so make sure you concatenate:

write-verbose ("Project folder " + $selectedProjects[$i] + " already exists")

Which is slightly tedious.

Filed under: PowerShell, TFS

LinqPad and Entity Framework

Posted 31 May 2011, 16:36 | by | | Perma-link

I'm a massive fan of LINQPad - it's much more lightweight than SQL Management Studio if you want to muck around with your data, and allows me to work against SQL with C#, rather than having to remember how to use cursors and the like when I've got some otherwise tedious data re-mapping to do.

That said, I'd never really pointed it at an Entity Framework EDM before, but I'd been wondering about the best way to tidy up the code for the tag clouds in the albums section of this site, so thought that I should see what I could do with the entity model as it stood, without resorting to stored procs.

Firing up the "Add Connection" wizard, I selected "Entity Framework" from the "Use typed data context from your own assembly", selected the website library, selected the (only) EDM from the library, and confirmed the provider.

LINQPad then created the data context correctly, so I pulled up a query to start working against it, only to get the following error message:

An assembly with the same identity 'System.Data.Entity, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' has already been imported. Try removing one of the duplicate references.

This was slightly confusing - obviously both my project and LINQPad are referencing the System.Data.Entity namespace, and if I remove the reference from my project, I won't be able to build it. How do I remove it from LINQPad then?

It turns out that I don't have to do either. The clue (for me) lay in the notes section of the "Using LINQPad with Entity Framework" page:

If your assembly references other assemblies, LINQPad will pick up these, too, providing they live in the same folder.

Note the qualifier at the end (which is in bold on the original page) - it would appear that my reference to the System.Data.Entity library was set to "Copy Local". Setting that to false and restarting LINQPad removed the error message, allowing me to clean up the tag code somewhat.

Filed under: Entity Framework, Fixes, LINQ, Tools

Killing off TFS 2005: Part 5

Posted 09 May 2011, 17:20 | by | | Perma-link

Just when you thought it was safe to enter source control

Apparently, we weren't finished with the upgrade from Team Foundation Server 2005. We've upgraded, and apparently people are successfully (after a couple of false starts*) checking stuff out, working on it, and checking it back in, so it's all good… Or is it?

The full series

Part 5: In Which Errors Are Noted

You may recall that in Part 3 I had to fiddle around with a few report definitions and import a new work item into the process template. What I forgot to note down in the blog was the fact that I also had to tweak the names of a couple of fields in one of the work items to get it to import successfully - I probably forgot to write about it because it's something I often had to do when mucking around with them.

My long suffering Network Manager pointed out to me that the Application Log on the TFS server Newyork was being filled with the following error:

TF53010: The following error has occurred in a Team Foundation component or extension:
Date (UTC): 03/05/2011 17:57:56
Machine: NEWYORK
Application Domain: TfsJobAgent.exe
Assembly: Microsoft.TeamFoundation.Framework.Server, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a; v2.0.50727
Service Host:
Process Details:
  Process Name: TFSJobAgent
  Process Id: 4008
  Thread Id: 1524

Detailed Message:
  Adapter: Microsoft.TeamFoundation.WorkItemTracking.WorkItemTrackingWarehouseAdapter
  Team Project Collection: Legacy

TF221161: There are conflicting definitions of the following work item fields in the warehouse: System.ExternalLinkCount<->System.ExternalLinkCount (DefaultCollection); System.HyperLinkCount<->System.HyperLinkCount (DefaultCollection); System.AttachedFileCount<->System.AttachedFileCount (DefaultCollection). These conflicting definitions come from different project collections. Work items from project collection Legacy will not be updated until the conflict is resolved. You can resolve the conflict by making the definitions of the field the same in all project collections, or marking the field as non-reportable in the project collection Legacy. For more information, see the Microsoft Web site (http://go.microsoft.com/fwlink/?LinkId=160807).

The link at the end of the error gives you pretty much all the information you need to resolve the issue, but it bears repeating so that I don't forget it again:

Start by doing a quick compare and contrast of the two systems using the Visual Studio 2010 command prompt and the Work Item Template Admin tool (witadmin):

First, call ListFields on both collections to compare their settings:

witadmin listfields /collection:http://newyork:8080/tfs/Legacy /n:System.ExternalLinkCount

This resulted in:

Field: System.ExternalLinkCount
Name: ExternalLinkCount
Type: Integer
Use: System
Indexed: False
Reportable As: measure:sum

Calling ListFields with the original collection name resulted in:

Field: System.ExternalLinkCount
Name: External Link Count
Type: Integer
Use: System
Indexed: False
Reportable As: measure:sum

Note the spaces in the Name field - this is what's causing the problem: the fields both have the same reference name but different display names, and these need to be in sync to stop the errors.

Seeing as the fields were currently being reported on in the original project collection, it makes sense to use the version with spaces:

witadmin changefield /collection:http://newyork:8080/tfs/Legacy /n:System.ExternalLinkCount /name:"External Link Count"

Note the quotes around the new name value. Finally validate that the field was correctly updated by re-issuing the ListFields command - I won't encourage you to add the optional /noprompt parameter, but it's there if you don't want to confirm each step.

Repeat this for each field listed in the error, and you should now be good to go.
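If you'd rather not type the command three times, the repetition can be scripted. This is just a sketch that shells out to witadmin for each field from the TF221161 error - witadmin must be on the PATH (e.g. run from a Visual Studio 2010 command prompt), and note that only "External Link Count" is the display name confirmed in this post; check the other two against a listfields call on your original collection before running anything:

```python
import shutil
import subprocess

# Conflicting fields from the TF221161 error, mapped to the display names
# used by the original (DefaultCollection) project collection.
# ASSUMPTION: only "External Link Count" is confirmed above - verify the
# other two with "witadmin listfields" before use.
FIELDS = {
    "System.ExternalLinkCount": "External Link Count",
    "System.HyperLinkCount": "Hyper Link Count",
    "System.AttachedFileCount": "Attached File Count",
}

def changefield_command(collection, ref_name, display_name):
    """Build the witadmin changefield argument list for one field.

    Passing a list to subprocess means the spaces in the display name
    are handled without manual quoting.
    """
    return ["witadmin", "changefield",
            "/collection:{0}".format(collection),
            "/n:{0}".format(ref_name),
            "/name:{0}".format(display_name),
            "/noprompt"]

# Only attempt the calls if witadmin is actually available.
if shutil.which("witadmin"):
    for ref_name, display_name in FIELDS.items():
        subprocess.check_call(
            changefield_command("http://newyork:8080/tfs/Legacy",
                                ref_name, display_name))
```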

* The false starts were those expected: The developers were still working with VS 2008, and so were getting a permissions issue when trying to connect to TFS 2010. This is resolved by:

  1. Installing the relevant Forward Compatibility Update (2005 and 2008)
  2. When connecting to the server, rather than entering just the server name in the "Name" field, supplying the entire path to the project collection:
    http://newyork:8080/tfs/Legacy

Filed under: TFS

Killing off TFS 2005: Part 4

Posted 27 April 2011, 12:12 | by | | Perma-link

Yay for short working weeks! I started writing this post last week, and then Easter came along and this week's even shorter as we've been given an extra bank holiday so we can all go and watch the Royal Wedding, which will be fun for all concerned I'm sure - hope the weather holds, but it's not looking great at the moment…

Last time we tried Killing off TFS 2005

We'd started the dry run of our TFS Integration process on the test server TFS10. This had started migrating some of the work items, but also had issues with a missing Work Item Type. I then left it running, and went on holiday…

The full series

State of the nation

I got back to discover that the server had been restarted at some point during the last week (thanks for the mandatory Windows Updates on the servers, chaps ;)), but after some digging discovered that the process had crashed with an "Out Of Memory" exception. I suppose trying to migrate part of an 11GB TFS instance to another database, on the same server, was a bit much to ask. Luckily, TFS10 is a virtual machine, so after a quick shut down/re-configure/restart we were back in operation with 6GB of RAM, and I tried again.

I imported the missing Work Item Template to the target project, and then all 2,930 Work Items imported without any problems.

However, I had a number of issues with the version control migration, many of which in the end, I've not been able to resolve:

  1. The workspace names are a combination of the server name, the current instance GUID and your Session Names, and this must be less than 64 characters or errors will occur - the workarounds for this are:
    1. Change the WorkSpaceRoot application setting in the MigrationToolServers.config file in the TFS Integration Tools install directory to be as short as possible
    2. Ensure that the "Name" fields of the Left and Right Version Control Session Sources are short but meaningful.
  2. Related to 1 - we've got some historical check-ins to folders that resulted in a total path length of over 260 characters; for various reasons TFS can't cope with this, and other than carrying out the steps in 1, there's nothing you can do about it.
  3. Regardless of what the tool states, you can edit the Version Control filter pairs after the configuration has been saved to the database if you're happy to edit the XML directly, just remember to leave off the trailing "/" - however this will result in the Discovery and Analysis phases running again.
  4. We've already performed some sort of merge from different projects into one of the projects we're trying to merge, and this caused an issue because the TFS Integration Tool couldn't find a workspace mapping for these old, no longer existing projects - I was hopeful that this could be resolved by adding a new filter pair to try and map them in manually, however I've not been able to confirm this completely.
  5. Various Null Reference Exceptions stating that the Path could not be null that caused the migration to stop.

So, in the end, I've had to pull the plug on this experiment and live with a separate Team Project Collection for the legacy code base - I'll keep telling myself that "it's probably OK, we're not going to be working on them side-by-side" and things like that, but in the end I'm a little sad that it didn't work.

If you've got a fairly simple or small history (or you're happy to drop a good portion of it), and you know that your file paths are short enough, then I think the Team Foundation Server Integration Platform is a great tool; but as they warn you throughout their documentation, it really should only be used if you absolutely have to, and you can't live with an upgrade. (A note on that: for some reason, on the live TFS 2010 box, I actually had to restart the TfsJobAgent to get the upgrade to complete, as the new Project Collection wasn't created until I restarted it.)

Filed under: TFS