WCF Data Services vs. WCF RIA Services – Making the Right Choice

Windows Communication Foundation (WCF) provides all the support you need to build distributed service-oriented data access solutions. You can certainly work with WCF directly to create custom services and expose data from an Entity Data Model (EDM) with the ADO.NET Entity Framework, or from any other data access layer (DAL) of your choosing. To take this “raw” approach, you need to start with the basics, or what is commonly referred to as the ABC’s of WCF: Addresses, Bindings, and Contracts. You must create service, operation, and data contracts, and then configure your service model with appropriate endpoint addresses and compatible bindings to be reachable by clients. Services are usually stateless, so you must also handle client-side change tracking and multi-user conflict resolution entirely on your own. The learning curve can be quite steep, after which you will still need to expend a great deal of effort to make it work.
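For example, even the simplest raw WCF service starts with contract definitions like these (a minimal sketch; the service and type names are hypothetical):

    using System.Runtime.Serialization;
    using System.ServiceModel;

    // Data contract: controls how the entity serializes across the wire
    [DataContract]
    public class Customer
    {
        [DataMember]
        public int CustomerId { get; set; }

        [DataMember]
        public string CompanyName { get; set; }
    }

    // Service contract: defines the operations exposed at an endpoint
    [ServiceContract]
    public interface ICustomerService
    {
        [OperationContract]
        Customer GetCustomer(int customerId);

        [OperationContract]
        void UpdateCustomer(Customer customer);
    }

And that’s before you implement the service class, configure endpoints and bindings, and write all the client-side change tracking yourself.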

Alternatively, you can turn to one of two newer technologies that Microsoft has built on top of WCF: WCF Data Services and WCF RIA Services. These represent two very different approaches for building data-oriented services. Both provide abstractions that shield you from many underlying WCF particulars, so you get to spend more time focusing on your application and less time on plumbing. For one thing, you don’t need to code WCF contracts or manage change tracking on the client; all of that gets done for you. With WCF RIA Services (and Silverlight), you don’t even need to create and update service references; Visual Studio generates code automatically via a special link that keeps your client and WCF RIA Services projects in sync at all times.

Both WCF Data Services and WCF RIA Services can solve many of the same problems, so it is only natural to question which one to use. The answer extends a bit beyond the standard “it depends on your scenario” response, since WCF RIA Services offers a lot more than just data access functionality. It also features client-side self-tracking entities, client-side validation, automatic server-to-client code generation, and more. In this blog post, I’ll discuss both platforms at a high level to help guide you in making the right choice.

WCF Data Services

Microsoft designed WCF Data Services as a thin layer over Entity Framework that exposes data-centric services to client applications. Out of the box, you can quickly build WCF Data Services over an Entity Data Model with virtually no effort. Custom providers are available for data sources other than Entity Framework, but implementing them requires considerably more effort. You can think of WCF Data Services as universal Web Services built just for data, although the platform can easily be extended with custom service operation methods. It is based on the industry standards of Representational State Transfer (REST) and the Open Data Protocol (OData), which means that these services are consumable by virtually every type of client in the world.

REST provides a uniform interface for querying and updating data. It is based on HTTP, meaning that client requests are issued in the form of GET, POST, MERGE, and DELETE actions — standard verbs understood by all HTTP clients. Any REST query can be invoked with an HTTP GET request by expressing all the elements of the query in a properly formed Uniform Resource Identifier (a URI, which is a more general term than Uniform Resource Locator [URL]). You can even test the service with an ordinary browser; simply type the properly formed URI directly into the address bar and you will receive an Atom Publishing Protocol (AtomPub) response (an XML dialect very similar to the Really Simple Syndication [RSS] feed format). The POST, MERGE, and DELETE verbs correspond respectively to the insert, update, and delete operations supported by the service, and the payload (parameters, data, and other metadata) for these operations is carried in the body and headers of the HTTP request rather than in the URI.
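For example, against a hypothetical Northwind-based service hosted at Northwind.svc (all names here are illustrative), the following URI queries the Customers entity set for the first five German customers, ordered by company name:

    http://localhost/Northwind.svc/Customers?$filter=Country eq 'Germany'&$orderby=CompanyName&$top=5

The $filter, $orderby, and $top query options are all defined by OData (spaces are shown unencoded for readability), so any HTTP-capable client can compose this request without any Microsoft stack on board.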

Just as REST enables universal data access via HTTP, the Open Data Protocol (OData) establishes universal data structure via standard serialization formats (see http://www.odata.org). All clients can handle plain-text formats such as JavaScript Object Notation (JSON) and, of course, XML, so OData defines standard response formats based on both. JSON provides a compact structure suitable for many basic types of services, while XML forms the basis for the more verbose AtomPub feed format. AtomPub is the default serialization format in WCF Data Services, because it effectively leverages the hierarchical nature of XML to describe the rich structure of data and metadata in an Entity Data Model.
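To give a feel for the difference, here is an abbreviated and simplified sketch of a JSON response for a single customer (continuing the hypothetical Northwind example); the same entity serialized as AtomPub would be an XML entry element several times this size:

    {"d": {"CustomerID": "ALFKI", "CompanyName": "Alfreds Futterkiste", "Country": "Germany"}}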

The WCF Data Services client libraries for Windows/WPF, ASP.NET, Silverlight, and Windows Phone 7 include a special LINQ provider, commonly known as LINQ to REST. This provider automatically translates client-side LINQ queries into an equivalent OData URI, meaning that you don’t need to learn the OData URI syntax if you are building Microsoft clients over WCF Data Services (just use good old LINQ). The client libraries automatically deserialize the AtomPub feed response from WCF Data Services into ready-to-use objects. They also provide a stateful context object that can track changes on the client for pushing updates back to the server, although your code needs to explicitly notify the context about objects as they are changed.
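Here is a minimal sketch of that workflow in C#, assuming a context class named NorthwindEntities generated by adding a service reference to the hypothetical Northwind.svc (all names are illustrative):

    using System;
    using System.Linq;

    class Demo
    {
        static void Main()
        {
            // Context generated by the Add Service Reference wizard
            var context = new NorthwindEntities(new Uri("http://localhost/Northwind.svc"));

            // LINQ to REST translates this query into the OData URI
            // .../Customers?$filter=Country eq 'Germany'
            var germanCustomers =
                from c in context.Customers
                where c.Country == "Germany"
                select c;

            foreach (var customer in germanCustomers)
            {
                customer.ContactTitle = "Valued Customer";

                // The context does not detect changes on its own; you must
                // explicitly notify it about every modified object:
                context.UpdateObject(customer);
            }

            // Pushes the tracked changes back to the service
            context.SaveChanges();
        }
    }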

WCF RIA Services

WCF RIA Services is, well, richer than WCF Data Services (and also newer). Indeed, the R in RIA means rich, although the full TLA (Three-Letter Acronym) can stand for Rich Internet Application or Rich Interactive Application, depending on who you’re talking to. Since its earliest days, WCF RIA Services has been designed to work best with Silverlight, although it now also supports OData, SOAP, and JSON to reach a wider range of clients. You can build WCF RIA Services over any data access layer, including Entity Framework and LINQ to SQL. You can also use Plain Old CLR Objects (POCOs), in which case you handle the persistence yourself using your data access technique of choice, even conventional ADO.NET (raw readers and command objects, or DataSets). In any case, you expose WCF RIA Services by coding domain service classes that support CRUD operations, as well as other custom service operations. You also maintain a special metadata class for each entity that auto-magically surfaces on the client for effortless end-to-end validation across the wire.
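The following is a minimal sketch of a domain service over a hypothetical Northwind Entity Data Model, along with the “buddy” metadata class for one entity (all type names are illustrative):

    using System.Linq;
    using System.ComponentModel.DataAnnotations;
    using System.ServiceModel.DomainServices.EntityFramework;
    using System.ServiceModel.DomainServices.Hosting;

    // Hypothetical domain service exposing CRUD over the Northwind EDM
    [EnableClientAccess]
    public class NorthwindDomainService : LinqToEntitiesDomainService<NorthwindEntities>
    {
        public IQueryable<Customer> GetCustomers()
        {
            return this.ObjectContext.Customers.OrderBy(c => c.CompanyName);
        }

        public void InsertCustomer(Customer customer)
        {
            this.ObjectContext.Customers.AddObject(customer);
        }

        public void UpdateCustomer(Customer customer)
        {
            this.ObjectContext.Customers.AttachAsModified(
                customer, this.ChangeSet.GetOriginal(customer));
        }

        public void DeleteCustomer(Customer customer)
        {
            this.ObjectContext.Customers.Attach(customer);
            this.ObjectContext.Customers.DeleteObject(customer);
        }
    }

    // Metadata ("buddy") class; these validation rules automatically
    // surface on the client
    [MetadataType(typeof(Customer.CustomerMetadata))]
    public partial class Customer
    {
        internal sealed class CustomerMetadata
        {
            [Required, StringLength(40)]
            public string CompanyName { get; set; }
        }
    }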

When WCF RIA Services is used with Silverlight, you don’t need to create and update service references; Visual Studio generates code automatically via a special link that keeps your client and WCF RIA Services projects in sync at all times. Like a service reference, this link binds the two projects together, only a WCF RIA Services link couples them much more tightly than an ordinary service reference does. Public changes on the service side are reflected automatically in corresponding classes on the client side every time you perform a build, so you never need to worry about working against an outdated proxy in the client project simply because you forgot to manually update a service reference.

The WCF RIA Services link greatly simplifies the n-tier pattern, and makes traditional n-tier development feel more like the client/server experience. With the link established, Visual Studio continuously regenerates the client-side proxies to match the domain services on each build. It also auto-generates client-side copies of shared application logic you define in the services project, simply by looking for classes you’ve defined in files named *.shared.cs or *.shared.vb. The link enforces automatic client-side validation and keeps validation rules in sync between the domain services and the client at all times. Furthermore, client-side entities are completely self-tracking; you do not need to manually notify the context object of every change to every entity, as you are required to do with the WCF Data Services client library.
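Contrast the earlier WCF Data Services client snippet with this sketch of the equivalent Silverlight code against the hypothetical domain service shown above; note the absence of any UpdateObject-style notification:

    // DomainContext generated automatically by the WCF RIA Services link
    var context = new NorthwindDomainContext();

    // Loads execute asynchronously in Silverlight
    context.Load(context.GetCustomersQuery(), loadOperation =>
    {
        foreach (var customer in loadOperation.Entities)
        {
            // Entities are self-tracking; just change them...
            customer.ContactTitle = "Valued Customer";
        }

        // ...and submit. The context already knows what changed.
        context.SubmitChanges();
    }, null);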

Comparing WCF Data Services and WCF RIA Services

The following table summarizes several key differences between the two platforms.

Supported Clients
  WCF Data Services: Resource-based API; supports all clients via deep REST and OData support.
  WCF RIA Services: Domain-based API; most tailored for use with Silverlight, but supports other clients via SOAP, JSON, and OData.

Supported Data Access Layers
  WCF Data Services: Targets EF. Other DALs are supported, but greater effort is required.
  WCF RIA Services: Supports EF, LINQ to SQL, and POCO (custom persistence layer).

Client Development
  WCF Data Services: Requires you to notify the context for change tracking.
  WCF RIA Services: Supports self-tracking entities, synchronized client/server logic, and much more (particularly with Silverlight).

Service Development
  WCF Data Services: Instant, code-less, extensible REST services out of the box (with EF); “free CRUD.”
  WCF RIA Services: Requires you to code CRUD operations manually in domain service classes.

From this comparison, it looks like WCF RIA Services is more attractive for Silverlight clients than non-Silverlight clients—regardless of which data access layer is used. Conversely, it shows that WCF Data Services is more appropriate for use with Entity Framework than it is with other data access layers—regardless of which client is used. But let’s examine things in a bit more detail.

If your scenario uses EF on the back end and targets Silverlight on the front end, then you are in the best position. Both WCF frameworks pack a huge win over writing traditional WCF services “by hand.” Your decision at this point is based on whether you simply require services to provide data access (that is, you primarily need CRUD support), or if you are seeking to leverage additional benefits. Another consideration is whether you are targeting Silverlight as the client exclusively or not.

Of the two, WCF Data Services is relatively lightweight, and requires almost no effort to get up and running. So it’s the better choice if you primarily require data access functionality in your services, particularly if you want to keep your service open to non-Silverlight clients as well. WCF RIA Services is more robust, and offers numerous additional features, making it a very compelling choice for the development of rich client applications. Although it began as a platform almost exclusively designed for Silverlight, support is steadily emerging for other client platforms via OData, SOAP, and JSON, as well as through self-tracking entity libraries now available for jQuery. However, it requires the effort of creating domain service classes to support CRUD operations.

Finally, both frameworks are extensible, and both can be secured by traditional authentication and authorization techniques. They are also both capable of integrating with the ASP.NET Membership provider for role management and personalization.

What if you are using neither Silverlight nor Entity Framework? Well, then your work will be cut out for you whatever choice you make. With WCF Data Services, you will need to implement either the Reflection or Streaming provider, or write your own custom provider. And with WCF RIA Services, you will not get to fully enjoy all the benefits of the framework, yet you will still need to write domain services and metadata classes. After careful consideration, you may well conclude that neither choice is appropriate, and decide instead to stick with tried-and-true WCF services, coding your own service contracts, data contracts, binding configurations, change tracking, validations, and so on.
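For the WCF Data Services route, a Reflection provider service amounts to exposing IQueryable properties from a plain class. Here is a minimal read-only sketch (class and property names are hypothetical):

    using System.Collections.Generic;
    using System.Linq;
    using System.Data.Services;
    using System.Data.Services.Common;

    [DataServiceKey("OrderId")]
    public class Order
    {
        public int OrderId { get; set; }
        public string ProductName { get; set; }
    }

    // The Reflection provider discovers entity sets from IQueryable properties
    public class OrderDataModel
    {
        private static readonly List<Order> _orders = new List<Order>
        {
            new Order { OrderId = 1, ProductName = "Chai" }
        };

        public IQueryable<Order> Orders
        {
            get { return _orders.AsQueryable(); }
        }
    }

    public class OrderDataService : DataService<OrderDataModel>
    {
        public static void InitializeService(DataServiceConfiguration config)
        {
            // Expose the Orders entity set as read-only
            config.SetEntitySetAccessRule("Orders", EntitySetRights.AllRead);
            config.DataServiceBehavior.MaxProtocolVersion =
                DataServiceProtocolVersion.V2;
        }
    }

Supporting updates takes more work still (implementing IUpdatable), which is exactly the extra effort the comparison table alludes to.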

Summary

Both WCF Data Services and WCF RIA Services can represent huge savings in development effort. Which one you choose (or whether indeed you choose to use either) depends a great deal on your data access layer of choice and the types of clients you intend to reach. This post tells you what you need to know to intelligently distinguish between them.

My upcoming book, Programming Microsoft SQL Server 2012, has an entire chapter dedicated to this topic. Look forward to extensive coverage and code samples for complete data access solutions using both platforms. The release date is just a few short months away, so stay tuned!

How Significant is the SQL Server 2012 Release?

SQL Server, particularly its relational database engine, matured quite some time ago. So the “significance” of every new release over recent years can be viewed—in some ways—as relatively nominal. The last watershed release of the product was actually SQL Server 2005, which was when the relational engine (that, for years, defined SQL Server) stopped occupying “center stage,” and instead took its position alongside a set of services that now, collectively, define the product. These of course include the Business Intelligence (BI) components Reporting Services, Analysis Services, and Integration Services—features that began appearing as early as 1999 but, prior to SQL Server 2005, were integrated sporadically as a patchwork of loosely coupled add-ons, wizards, and management consoles. SQL Server 2005 changed all that with a complete overhaul. For the first time, SQL Server delivered a broader, richer, and more consolidated set of features and services built into—rather than bolted onto—the platform. None of the product versions released since that time—SQL Server 2008, 2008 R2, and now 2012—has changed the underlying architecture as radically.

That said, each SQL Server release continues to advance itself in vitally significant ways. SQL Server 2008 (released August 6, 2008) added a host of new features to the relational engine—T-SQL enhancements, Change Data Capture (CDC), Transparent Data Encryption (TDE), SQL Audit, FILESTREAM—plus powerful BI capabilities with Excel PivotTables, charts, and CUBE formulas. SQL Server 2008 R2 (released April 21, 2010) was dubbed the “BI Refresh,” adding PowerPivot for Excel, Master Data Services, and StreamInsight, but offering nothing more than minor tweaks and fixes in the relational engine.

The newest release—SQL Server 2012—is set to officially launch on March 7, 2012, and like every new release, this version improves on all of the key “abilities” (availability, scalability, manageability, programmability, and so on). The chief availability improvement is the new High Availability Disaster Recovery (HADR) alternative to database mirroring. HADR (also commonly known as “AlwaysOn”) utilizes multiple secondary servers in an “availability group” for scale-out read-only operations (rather than forcing them to sit idle, just waiting for a failover to occur). Multi-subnet failover clustering is another notable new availability feature.

SQL Server 2012 adds many new features to the relational engine, many of which I have blogged about, and most of which I cover in my new book (soon to be published). There are powerful T-SQL extensions, most notably the windowing enhancements, 22 new T-SQL functions, improved error handling, server-side paging, sequence generators, and rich metadata discovery techniques. There are also remarkable improvements for unstructured data, such as the FileTable abstraction over FILESTREAM and the Windows file system API, full-text property searching, and Statistical Semantic Search. Spatial support gets a big boost as well, with support for circular data, full-globe support, increased performance, and greater parity between the geometry and geography data types. Contained databases simplify portability, and new “columnstore” technology drastically increases performance of huge OLAP cubes (VertiPaq for PowerPivot and Analysis Services) and data warehouses (a similar implementation in the relational engine).
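As one concrete illustration, here is a sketch of the new server-side paging syntax invoked from ADO.NET (the connection string, table, and column names are hypothetical):

    using System;
    using System.Data.SqlClient;

    class PagingDemo
    {
        static void Main()
        {
            // SQL Server 2012's OFFSET/FETCH clause performs paging directly
            // in T-SQL, with no ROW_NUMBER() wrapper query required
            const string sql = @"
                SELECT CustomerId, CompanyName
                FROM   dbo.Customer
                ORDER  BY CompanyName
                OFFSET     @Skip ROWS
                FETCH NEXT @Take ROWS ONLY;";

            using (var connection = new SqlConnection(
                "Server=.;Database=Demo;Integrated Security=true"))
            using (var command = new SqlCommand(sql, connection))
            {
                command.Parameters.AddWithValue("@Skip", 20); // page 3,
                command.Parameters.AddWithValue("@Take", 10); // 10 rows per page

                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                        Console.WriteLine("{0}: {1}",
                            reader["CustomerId"], reader["CompanyName"]);
                }
            }
        }
    }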

The aforementioned features are impressive, but still don’t amount to much more than “additives” over an already established database platform. A new release needs more than just extra icing on the cake for customers to perceive an upgrade as compelling. To that end, Microsoft has invested heavily in BI with SQL Server 2012, and the effort shows. The BI portion of the stack has been expanded greatly, delivering key advances in “pervasive insight.” This includes major updates to the product’s analytics, data visualization (such as self-service reporting with Power View), and master data management capabilities, as well as Data Quality Services (DQS), a brand new data quality engine. There is also a new Business Intelligence edition of the product that includes all of these capabilities without requiring a full Enterprise edition license. Finally, SQL Server Data Tools (SSDT) brings brand new database tooling inside Visual Studio 2010. SSDT provides a declarative, model-based design-time experience for developing databases whether connected, offline, on-premises, or in the cloud.

Of course there are more new features; I’ve only mentioned the most notable ones in this post. Although the relational engine has been rock-solid for several releases, it continues to enjoy powerful enhancements, while the overall product continues to reinvent itself by expanding its stack of services, particularly in the Business Intelligence space.