I just got a new Dell Latitude D810. It absolutely flies. However, I was getting very weird playback when playing certain WMV files (e.g. ECO is Child's Play
or the PDC sessions
). The playback was tinged with orange, lime green, and other colors, to the point where I couldn't see the video. I felt like firing up some Jefferson Airplane to complete the effect.
To solve this, I ended up opening WMP9, going to Tools | Options | Performance and turned the Video acceleration setting down to None. Problem solved.
A version of LINQ that is compatible with the RTM version of VS2005 has been released on this page.
This post doesn't contain any actual technical information. I'm making this post to help a Borland R&D person debug a very minor, but somewhat irritating, problem with the Welcome Page. The problem pops up when a post containing <pre> and <code> tags gets summarized. Here's a sample.
procedure ScrollToLineCol(AMemo: TCustomMemo; Line, Col: integer);
var
  CharNum, Len, AdjCol: integer;
begin
  if AMemo = nil then
    Exit;
  // Find the character index of the first character on the requested line
  CharNum := AMemo.Perform(EM_LINEINDEX, Line, 0);
  // Make sure we keep the column within this line
  Len := AMemo.Perform(EM_LINELENGTH, CharNum, 0);
  if Len > Col then
    AdjCol := Col
  else
    AdjCol := Len;
  // Add the column to the caret position and scroll it into view
  AMemo.SelStart := CharNum + AdjCol;
  AMemo.Perform(EM_SCROLLCARET, 0, 0);
end;
Hopefully, this makes it easy to debug now!
Borland is doing another 24 hours of Delphi
on October 24. I'm on from 9:50am-10:20am CDT that morning (for non-US readers, check the BDN article for time conversions). I'll be talking about migration considerations for Delphi 2006, including changes to VCL, DBX, and DataSnap. I'm excited about getting a chance to share some of the things I've learned about Delphi 2006 and pass on my very favorable experience with the product. Hopefully you'll come away from my talk with a few time-saving tips for migrating your applications to Delphi 2006. Looking forward to seeing you online that day!
Since Allen Bauer spilled the beans that FastMM4 will be the default memory manager in DeXter
, we have switched over to FastMM4
for all of our applications. These applications are heavily threaded MIDAS/COM servers, with internal COM/DLL references. The switchover has been a real eye-opener. Using the debug version of FastMM in order to pinpoint memory leaks, we have found some (more than a couple, but less than "a lot" :)) places throughout all of the applications where we were forgetting to free memory. Another benefit is that the servers have been performing much better under stress than they used to.
In addition, Pierre has been absolutely fantastic. I've had several email exchanges with him, and he is always willing to take bug reports and get fixes, in addition to implementing feature requests. My company has donated to the FastMM project, and I am adding another donation from my pocket since I think this is such a great tool, and a worthwhile thing to support. I would highly recommend that you check this out, and if you find it useful, throw a donation his way!
As I mentioned in my previous post about LINQ and IDENTITY fields
, I was getting an exception when trying to update back to the DB. I have since tracked this down to what I guessed in that post: it was due to the way I constructed the customer object via foreach. That technique seems to keep a DataReader open longer than needed, which results in this error. I have a workaround, so I can at least share this sample to show you how related tables with IDENTITY fields work. If you want to see the exception, replace the line where I assign the cust variable with a "foreach (var cust in query)".
Things to note in this sample:
- I am mixing and matching objects retrieved from the DB (cust) and locally created objects (o and od).
- You can assign the Order object to the customer either by setting o.Customer to point to the customer, or using cust.Orders.Add(o). The same holds true for adding OrderDetail to Order.
- IDENTITY fields are not initialized upon creation, but they are assigned the actual values generated by the DB once the rows are inserted.
- You can link tables together by either using the object references or the actual data field. For example, when building the OrderDetail object, I use od.ProductID to set the value that will get written to the DB. I could also have constructed/received a Product object from the DB and assigned od.Product to do the linking via object reference.
- Linking between related tables is taken care of automatically, even when linked via IDENTITY fields
static void Main(string[] args)
{
    Northwind db = new Northwind(
        @"Data Source=(local);Initial Catalog=Northwind;Integrated Security=True");
    db.Log = Console.Out;

    var query = from c in db.Customers
                where c.CustomerID == "ALFKI"
                select c;
    Customer cust = query.ToArray()[0];

    Order o = new Order();
    o.Customer = cust;
    o.Freight = 23;
    Console.WriteLine("[PRE] Order.OrderID == " + o.OrderID);

    OrderDetail od = new OrderDetail();
    od.Order = o;
    od.Quantity = 1;
    od.ProductID = 1;
    Console.WriteLine("[PRE] OrderDetail.OrderID == " + od.OrderID);

    db.SubmitChanges();

    Console.WriteLine("[POST] Order.OrderID == " + o.OrderID);
    Console.WriteLine("[POST] OrderDetail.OrderID == " + od.OrderID);
}
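As the bullet points above mention, the same links can be made from the other side of each relationship. Here is a sketch of that variant, reusing the cust variable from the sample; note that the collection property names (Orders, OrderDetails) are assumptions based on the class names in the sample, since the actual generated names depend on the Tech Preview's mapping:

```csharp
// Equivalent linking via the collection side of the relationships
Order o = new Order();
o.Freight = 23;
cust.Orders.Add(o);        // instead of o.Customer = cust

OrderDetail od = new OrderDetail();
od.Quantity = 1;
od.ProductID = 1;
o.OrderDetails.Add(od);    // instead of od.Order = o
```

Either style marks the new objects for insertion when SubmitChanges() runs, and the IDENTITY values flow through the graph the same way.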
I was preparing a writeup on how LINQ handles IDENTITY fields, and there is some mixed news on this front. First off, the good news. The concept of how LINQ will deal with IDENTITY fields is quite solid. The goal is to have the DB return the new value for the IDENTITY field and then populate the object with that new value. Very slick. Back in Delphi 5, MIDAS introduced the poAutoRefresh flag in TDataSetProvider.Options that was supposed to do the same thing. However, it was never implemented, so it never worked. As a result, you had to use other options, like the one I outlined in my BDN article
. So LINQ has a definite advantage here.
The other really cool thing about LINQ IDENTITY handling is that you can build up a graph of objects (i.e. Orders and OrderDetails), add the objects to a Customer object, and when you do a SubmitChanges(), the IDENTITY fields are updated properly throughout the whole object graph, in addition to being linked properly in the DB.
However, the bad news is that the DLINQ implementation is throwing a System.InvalidOperationException because "There is already an open DataReader associated with this Command which must be closed first.". If you are debugging the application and pause long enough before continuing, the data will still get written to the DB. However, I have not been able to get the application to work at all when just running it (i.e. Start Without Debugging). I believe it may be due to my mixed use of retrieving a customer from the DB using LINQ, and then creating Order and OrderDetail objects locally and adding them to the Customer object. I say this because doing a simple add of a Category object all by itself doesn't yield this exception. I also had an instance one time where only the Order was inserted, and not the OrderDetail, thereby invalidating the atomicity of the transaction. I plan on cleaning up the test case and submitting it to MS. Anyone know where this kind of feedback should go?
In the current version of DLINQ, you can only target MSSQL. The architecture seems to be extensible enough to allow DBMS vendors to provide their own DLINQ assemblies, so that when you write DLINQ queries you can communicate with a variety of back-ends. See System.Data.DLinq.Provider.ProviderContext for the class you can descend from in order to get your own DBMS supported. Talking with MS, it appears they are hoping that each and every vendor will provide their own assemblies. I think this is a mistake for the following reasons:
- First off, the current code in System.Data.DLinq.DataContext uses code in System.Data.DLinq.SqlClient. It essentially forces the ProviderContext to be SqlClient. This will have to be cleaned up and made more generic before a vendor can deliver something.
- I believe it's time for MS to start providing tools that allow the developer to mix and match parts. Until MS does this, they will continue to develop solutions that work well with MSSQL but miss out on what other DBMSes can do. They need to take the lead by providing versions for other DBMSes so they can see where the holes in their current approach are. If they wait for the vendors, it will most likely be too late to change the code to properly support 3rd party DBs. It's a Catch-22: DB vendors will wait to write their assemblies until MS cleans up the code (see point 1 above) and until they see that doing so will yield positive returns. By the time they finally do, it may be too late to change the DLINQ architecture to accommodate whatever specific hooks are required for first-class support of a 3rd party DB.
- It doesn't appear that there is any registration/management code in place to allow you to simply say: "I'd like to use MSSQL (or Oracle, or InterBase)". It seems this would be resolved by which assemblies you reference in your code. But how well will that work when you want to target multiple DBs in your application?
- Finally, the competition (i.e. Borland) already provides low-level multi-DB code in several cases, and has for years: BDE, DBExpress, and Borland Data Providers (BDP). If a company with a development team that is a minute fraction of MS's development team can deliver this, there's no reason MS couldn't do the same.
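To make the extensibility point concrete, here is a minimal sketch of what a vendor-supplied provider might look like. The base class is the one named above, but everything else (the class name, and what a real implementation would override) is a hypothetical assumption, not the actual Tech Preview API:

```csharp
// Hypothetical sketch only: ProviderContext is the documented extension
// point, but the InterBase provider shown here is an illustration, and a
// real one would have to override the (currently SqlClient-bound) members
// that translate LINQ expression trees into vendor-specific SQL and manage
// connections and transactions.
public class InterBaseProviderContext : System.Data.DLinq.Provider.ProviderContext
{
    // SQL generation, connection handling, and type mapping for
    // InterBase would live here.
}
```

Until the SqlClient coupling described in point 1 is removed, a descendant like this can't actually be plugged in, which is exactly the problem.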
There is some conflicting information out there about what you need to run LINQ. This link
says that the C# LINQ Tech Preview will only work with VS.NET 2005 Release Candidate (RC). What is troubling is that when you download the MSI file from that page, it is exactly the same MSI file as the one distributed on Disc 4 at PDC. That file would only install on VS.NET 2005 Beta 2.
To make matters worse, this link tells us that the VB LINQ Tech Preview from the web site will only work with VS.NET 2005 RC (as does The LINQ Project Page). If you install RC and then the VB LINQ Tech Preview, you'll find that the files that get installed are newer than the C# version. However, the VB package is missing the System.Data.DLinq assembly.
I imagine updates will be coming to sort all of this out soon, but just thought you should be aware of the current state of things. To make things simple for now, use Beta 2 and C#.
Edited on 9/17 to add: Looks like MS updated the page to say that C# only works with Beta 2.
.NET Language Integrated Query (LINQ)
is here. We've seen a couple of teasers over the past few months, but after seeing this stuff in-depth for a few days, I have to admit that I love what I see. So much so, that I created a new feed
for it because I plan on writing about LINQ quite a bit. For now, the generic description will have to do. It's a way to write query-like statements that can operate on sets, XML, data, and more using one syntax. Furthermore, this is all available in your native programming language (C#, VB, and others will follow soon, I'm sure), which means rich type information at design-time, and it's strongly-typed so you'll see errors at compile time. It's in beta right now, and there are all sorts of known warts, but the promise of a useful and valuable tool is excellent.
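As a small taste of that syntax, here is a query over a plain in-memory array, no database required. This is my own minimal sketch; the System.Query namespace is the one the Tech Preview ships for these query operators:

```csharp
using System;
using System.Query; // Tech Preview namespace for the standard query operators

class LinqTaste
{
    static void Main()
    {
        string[] names = { "Borland", "Microsoft", "Dell" };

        // Same from/where/select syntax works on any enumerable source
        var query = from n in names
                    where n.Length > 4
                    orderby n
                    select n.ToUpper();

        foreach (var n in query)
            Console.WriteLine(n); // prints BORLAND, then MICROSOFT
    }
}
```

Because the query is ordinary C#, a typo in a field name or a type mismatch shows up at compile time instead of as a runtime SQL error.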
Stay tuned for more information on LINQ!