Syncing Visual Studio Database Project With Entity Framework

Or: Quirks of Using Entity Designer Database Generation Power Pack

If you’re searching for a way to synchronise a “Visual Studio Database Project” (also known as “SQL Server Database Project” or “TSData Project”) with an Entity Framework model, you’ve probably already encountered the “Entity Designer Database Generation Power Pack”.

With model-first development, the most basic use case is to synchronise the database project with the model. If you are trying to achieve this, you need to be aware of conventions that have been baked into the tool:

  1. Your “.edmx” file must have the same name as your “.dbproj” file (i.e. the database project name).
    • It is just the file name of the model that is important.
    • Neither the “Namespace” nor the “Entity Container Name” property of the model needs to match the file name. In fact, you will not be able to use names containing full stops/periods (“.”) in these properties, but you may well require them in the file name if you’ve named your database project in this fashion.
    • Likewise, it is just the file name of the database project that is important.
  2. Your database project must lie at the root of the solution structure, i.e. do not put your database project in a solution folder.
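For example, the conventions above imply a solution layout like this (all names here are illustrative, not taken from a real project):

```
MyStore.sln                            ← solution file
MyStore.Data\MyStore.Data.dbproj       ← database project, at the solution root (not in a solution folder)
MyStore.Model\MyStore.Data.edmx        ← model file name matches the database project name
```

Note that the “.edmx” file name contains a full stop, which is allowed, even though the model’s “Namespace” and “Entity Container Name” properties cannot.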

Footnote

There won’t be any more development of the “Entity Designer Database Generation Power Pack”, as the functionality is being provided as part of the “SQL Server Developer Tools”, code-named “Juneau”. That will be the better toolset for satisfying this use case.


New UK Electricity Pylons?

This video on new pylon designs misses the option of putting the cables underground (or underwater) instead. This would have a higher up-front cost but lead to much better looking countryside and probably a lower total cost of ownership.

The lower total cost of ownership was the argument Dr Liam Fox (Defence Secretary) used in his recent objection to a scheme to place new pylons in his Somerset constituency.

If we do have to endure yet more pylons, I’d vote for the sail design, “Plexus”, by Amanda Levete Architects & Arup, shown in the attached photo.

[Photo: the “Plexus” pylon design by Amanda Levete Architects & Arup]


A SPDY end to web site content sharding?

With the advent of Google’s SPDY improvements to the HTTP protocol, could we see an end in sight to the practice of sharding content on web sites?

In the news today (from The Register), Google report a 15% increase in speed when using SPDY to communicate with their web services from Chrome browsers. SPDY establishes a single TCP session in which multiple HTTP-like requests are multiplexed efficiently. It means, for instance, that multiple requests can be processed concurrently rather than being capped at the two (or six in recent browsers) connections allowed per domain.

The practice of content sharding is used, in part, to achieve a similar effect. It lets browsers believe they’re downloading from different servers, so they will initiate more concurrent connections. Another benefit is reducing the payload of cookies, by serving assets from domains the cookies haven’t been set for.

SPDY should take care of all of this for us. In fact, using separate domains to serve images, CSS and JavaScript will perform worse with SPDY, as multiple TCP sessions will be established. So, assuming this technology becomes more widely available on the server side, we should probably start sharding content selectively, based on the capabilities of the user agent.
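The selective-sharding idea could be sketched like this (a minimal illustration, not a real framework API; the function and host names are invented for the example):

```python
# Sketch: choose which hosts to serve static assets from, based on
# whether the client negotiated SPDY. With SPDY, a single multiplexed
# connection to one host is fastest, so sharding would only add TCP
# setup cost; with plain HTTP/1.x, sharding lets the browser open more
# parallel connections (historically two, later six, per domain).

def asset_hosts(protocol, shard_hosts, primary_host):
    """Return the list of hosts to serve static assets from."""
    if protocol.lower().startswith("spdy"):
        # One multiplexed connection: keep everything on the main host.
        return [primary_host]
    # HTTP/1.x: spread assets across shard domains for parallelism.
    return shard_hosts

# Usage:
hosts = asset_hosts("spdy/2",
                    ["img1.example.com", "img2.example.com"],
                    "www.example.com")
# hosts == ["www.example.com"]
```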


EU Cookies Directive & eCommerce Analytics

Recital 66 of this directive, http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:L:2009:337:0011:0036:En:PDF, from the Official Journal of the European Union, states that:

“Third parties may wish to store information on the equipment of a user, or gain access to information already stored, for a number of purposes, ranging from the legitimate (such as certain types of cookies) to those involving unwarranted intrusion into the private sphere (such as spyware or viruses). It is therefore of paramount importance that users be provided with clear and comprehensive information when engaging in any activity which could result in such storage or gaining of access. The methods of providing information and offering the right to refuse should be as user-friendly as possible. Exceptions to the obligation to provide information and offer the right to refuse should be limited to those situations where the technical storage or access is strictly necessary for the legitimate purpose of enabling the use of a specific service explicitly requested by the subscriber or user. Where it is technically possible and effective, in accordance with the relevant provisions of Directive 95/46/EC, the user’s consent to processing may be expressed by using the appropriate settings of a browser or other application. The enforcement of these requirements should be made more effective by way of enhanced powers granted to the relevant national authorities.”

It’s not clear to me what “third parties” constitutes, but I assume it does not include the owner of the website the user is visiting. So Google Analytics / Omniture SiteCatalyst would count as third parties.

It’s possible you could interpret the use of an eCommerce site as a user explicitly requesting the service of browsing products for the purpose of purchasing. If so, the storage of analytics tags could be interpreted as necessary to provide an effective browse-and-purchase experience, through a process of analysis and improvement. It might then pass as serving the “legitimate purpose of enabling the use of a specific service explicitly requested by the subscriber or user”. It does feel rather tenuous, though.

This is all personal opinion and is not legal advice. Please seek somewhere else to pin the blame if you get taken to court for not asking your customers for permission to store cookies!


Internet Explorer 6 – Dying Days?

IE 6 was once the most popular browser on the market, despite its lack of adherence to standards and its poor rendering performance. These days it’s on the way out, and no-one in IT is mourning it. The graph here shows the trend of IE 6’s market share over the last year and a half (source: http://www.w3counter.com/globalstats.php).

The story’s not that simple, though. Corporations still persist in forcing their employees to use the browser, doubling its share during working hours, and Microsoft have still committed to supporting it for some years yet. The December 2009 attack on Google, which exploited IE 6 security flaws, doesn’t seem to have galvanised companies into ditching it, but maybe Google’s subsequent withdrawal of support for IE 6 in GMail and other Google Apps “later in 2010” will accelerate this.


ASP.NET HTTP 400 Bad Request Exception

Someone on my project pointed me at this useful article from Alois Kraus about the 400 “Bad Request” exception coming from ASP.NET. In particular, it cleared up the way that length checking is performed on a URL in an incoming request.

On my project we are relying on a key from FAST ImPulse, a search engine for eCommerce apps. This key is included in a URL. In certain deep search scenarios, e.g. lots of filtering on facets of a product, this key can become very long. This has been resulting in the “Bad Request” messages. It turns out that there is a difference between including this key in the path of the URL and as a query string parameter. For example:

http://www.example.com/{key}/results.aspx

http://www.example.com/results.aspx?key={key}

If {key} is close to 260 characters, the first URL will be rejected by ASP.NET whereas the second will be accepted, because ASP.NET limits only the path part of the URL to 260 characters; the query string is not counted against that limit. As the article mentions, ASP.NET 4 allows this limit to be changed.
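As a sketch of the ASP.NET 4 approach, the path and query-string limits can be raised in web.config via the `httpRuntime` element (the values below are illustrative; pick limits appropriate to your key lengths):

```
<configuration>
  <system.web>
    <!-- maxUrlLength governs the path portion of the URL;
         maxQueryStringLength governs the query string -->
    <httpRuntime maxUrlLength="1024" maxQueryStringLength="2048" />
  </system.web>
</configuration>
```

On earlier versions of ASP.NET, moving the key into the query string, as in the second URL above, is the simpler workaround.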


Setting up a keyboard shortcut for View History in Visual Studio using TFS

Note – this post refers to and is tested with Visual Studio Team System 2008.

If you’re using Visual Studio and Team Foundation Server, you’ll probably be viewing the history of files in your solution fairly regularly. Unfortunately there’s no keyboard shortcut set up for that. If you wish to set one up, here’s what to do:

  1. Go to the “Tools” menu and select “Options” and then “Keyboard”
  2. In “Show commands containing:” type “TfsHistory” and select “File.TfsHistory” in the list.
  3. In “Press shortcut keys”, put in your desired shortcut. Take a note of anything appearing in “Shortcut currently used by” to ensure you don’t overwrite anything.
  4. Click “Assign”

As an example, I set up “Ctrl+Shift+Alt+V” followed by “H” as the shortcut. Once I click “Assign”, I’m done.
