Tuesday, September 28, 2010

Dynamic Language Runtime

The dynamic language runtime (DLR) is a runtime environment that sits on top of the common language runtime (CLR) and provides a set of services for dynamic languages to run on the .NET Framework. It includes a set of libraries and constructs that allow object types to be resolved at run time, in contrast to statically typed languages like C#, where object types have to be defined at compile time.
Scripting/interpreted languages like JavaScript and PHP are good examples of dynamic languages.

Other popular examples are Lisp, Smalltalk, Ruby and ColdFusion.

Primary Advantages of DLR
  • Simplifies Porting Dynamic Languages to the .NET Framework
  • Enables Dynamic Features in Statically Typed Languages
  • Enables Sharing of Libraries and Objects
  • Provides Fast Dynamic Dispatch and Invocation

Architecture
The main services the DLR offers on top of the CLR include the following (a small C# sketch follows the list):
  • Expression trees: used to represent language semantics.
  • Call site caching: caches information about operations and operand types that have already been executed, so that repeated dynamic calls dispatch faster.
  • Dynamic object interoperability: provides a set of classes that language implementers can use and extend for interoperability with .NET.
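
A minimal C# sketch of the DLR in action (illustrative only, not from the original post): the dynamic keyword defers member resolution to run time, and the DLR's call-site caching speeds up repeated invocations.

using System;
using System.Dynamic;

class DlrDemo
{
    static void Main()
    {
        dynamic bag = new ExpandoObject();           // DLR-backed dynamic object
        bag.Name = "DLR demo";                       // members are created at run time
        bag.Describe = (Func<string>)(() => "Resolved through DLR call sites");

        Console.WriteLine(bag.Name);                 // member lookup happens at run time
        Console.WriteLine(bag.Describe());
    }
}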

Thursday, September 23, 2010

Overview of Entity Framework 4.0


 "The Entity Framework bridges the gap between how developers commonly manipulate conceptual objects (customers, orders, products; posts, tags, members; wall posts, private messages, friend connections) and the way data is actually stored (records in database tables).  The technical term for a tool that provides this abstraction is object relational mapper (ORM)."

This blog gives a gist of what EF (Entity Framework 4) has to offer and how to program against it.

Best place to start http://msdn.microsoft.com/en-us/data/aa937723.aspx 

For beginners, I would recommend reading http://blogs.msdn.com/b/adonet/archive/2010/07/19/absolue-beginners-guide-to-entity-framework.aspx

I have been going through the Entity Framework lately to implement it in our next project. To my surprise, programming against data (the database) has become very simple using this framework. It helps the developer focus more on understanding the business domain and modelling the business entities than on worrying about how to store and access the database.

The framework provides ways to generate the DB directly from the modelled entities. This is called the Model-First approach and is usually recommended. However, the reverse, generating entities from an existing DB, is also possible. This helps if you already have the DB ready and still want to leverage the framework.

LINQ queries or lambda expressions are mostly used to perform CRUD operations against the business entities instead of directly against the database.

If you open the .edmx file (Entity Model file) in an XML editor, you will see the following sections:
    * Storage Model - defines the entities, entity sets and associations as they exist in the database. All information required to create the database is picked from here.
    * Conceptual Model - defines the entities, entity sets and associations that will be consumed from the business layer. Information for modelling (the diagram) is picked from here.
    * Mappings/Associations - the mappings between the Storage and the Conceptual Model are defined here.

An EntitySet is the pluralized version of an Entity. A few base classes that you need to be aware of are:
    * ObjectContext - the base class for the entity container. Used as a container for entities.
    * EntityObject - the base class for the generated entity classes.

By default, EF uses deferred (lazy) query execution: a query is not sent to the database until its results are enumerated, and pending changes are sent as a batch of commands only when SaveChanges is called. There are also ways to force explicit, immediate execution of a query, as the small sketch below shows.
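
As a minimal illustration of deferred execution (the ctx variable and the Contacts/LastName names are assumptions here, following the AdventureWorks-style samples later in this post):

// No database call happens here; only the query definition is built.
var pendingQuery = ctx.Contacts.Where(c => c.LastName == "Smith");

// The SQL is generated and executed only when the results are enumerated.
var results = pendingQuery.ToList();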


A few commands to query and update look like the following:
ctx.Contacts.Where(c => c.SalesOrderHeaders.Any()).ToList();
ctx.Customers.AddObject(customer);
ctx.SalesOrderDetails.DeleteObject(orderDetail);
ctx.SaveChanges(); // pushes the pending inserts/updates/deletes to the database

There are three ways of programming against the Entity Framework:
  • LINQ to Entities - write LINQ queries to perform operations on entities.
  • Entity SQL - use SQL-like strings as commands. However, you are writing commands against the entities and not the DB.
  • Query Builder - use the query-builder methods provided with the Entity Framework instead of LINQ.

There are times when you would like to perform a complex set of operations on a varied set of tables while interacting with entities. This is when you can map entities to stored procedures. However, there are some limitations on using stored procedures with entities: for example, if one of the operations, say Insert, is mapped to a stored procedure, then the other operations also have to be mapped through stored procedures.

Tracing the SQL commands
During debugging you may want to see what DB query your LINQ operation is being converted to. The usual approach is to use the SQL Profiler.

This can be time consuming, switching between VS 2010 and SQL Server. Instead, you can leverage programming-model tracing using the System.Data.Objects.ObjectQuery.ToTraceString and System.Data.EntityClient.EntityCommand.ToTraceString methods, which let you view the store commands at run time without having to run a trace against the data source.

LINQ TO ENTITIES
int productID = 900;
// Define an ObjectSet to use with the LINQ query.
ObjectSet<Product> products = context.Products;
// Define a LINQ query that returns a selected product.
var result = from product in products
             where product.ProductID == productID
             select product;
// Cast the inferred type var to an ObjectQuery
// and then write the store commands for the query.
Console.WriteLine(((ObjectQuery<Product>)result).ToTraceString());


ENTITY SQL
// Define the Entity SQL query string.
string queryString =
    @"SELECT VALUE product FROM AdventureWorksEntities.Products AS product
      WHERE product.ProductID = @productID";
// Define the object query with the query string.
ObjectQuery<Product> productQuery =
    new ObjectQuery<Product>(queryString, context, MergeOption.AppendOnly);
productQuery.Parameters.Add(new ObjectParameter("productID", productID));
// Write the store commands for the query.
Console.WriteLine(productQuery.ToTraceString());


QUERY BUILDER
int productID = 900;
// Define the object query for the specific product.
ObjectQuery<Product> productQuery =
    context.Products.Where("it.ProductID = @productID");
productQuery.Parameters.Add(new ObjectParameter("productID", productID));
// Write the store commands for the query.
Console.WriteLine(productQuery.ToTraceString());


You can retrieve objects by key using the GetObjectByKey and TryGetObjectByKey methods on ObjectContext. These return the object with the specified EntityKey, attaching it to the object context. When you use GetObjectByKey, you must handle an ObjectNotFoundException; TryGetObjectByKey returns false instead of throwing. A small sketch follows below.
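
A small sketch of both methods (the entity set and key names are assumed to match the AdventureWorks samples above):

// Build the key for the entity you want.
var key = new EntityKey("AdventureWorksEntities.Products", "ProductID", 900);

// GetObjectByKey throws ObjectNotFoundException when nothing matches.
try
{
    var product = (Product)context.GetObjectByKey(key);
}
catch (ObjectNotFoundException)
{
    // handle the missing entity
}

// TryGetObjectByKey returns false instead of throwing.
object found;
if (context.TryGetObjectByKey(key, out found))
{
    var product = (Product)found;
}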

The cost of the EF abstraction

http://msdn.microsoft.com/en-us/library/cc853327.aspx
http://blogs.msdn.com/b/adonet/archive/2008/02/11/exploring-the-performance-of-the-ado-net-entity-framework-part-2.aspx

To improve performance, you should use compiled LINQ queries (CompiledQuery) to precompile your LINQ to Entities queries for faster repeated execution. A small sketch follows below.
http://msdn.microsoft.com/en-us/library/bb896297.aspx will provide more information on this.
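
Here is a small sketch of a precompiled query with CompiledQuery.Compile (the context and entity names are assumed to match the AdventureWorks samples above):

// Compile once, typically into a static field, ...
static readonly Func<AdventureWorksEntities, int, IQueryable<Product>> productById =
    CompiledQuery.Compile((AdventureWorksEntities ctx, int id) =>
        ctx.Products.Where(p => p.ProductID == id));

// ... then reuse the cached query plan on every call.
Product product = productById(context, 900).FirstOrDefault();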


By default, EF retrieves only an entity's own data, without the data of its related (associated) entities. For example, there may be a case where you want to retrieve SalesPersons along with their SalesOrder details.

There are ways to tell EF to retrieve all the related entities' data up front, so that you do not end up writing a foreach loop to fill in the related data yourself, which would result in far too many DB calls.


Using query paths (Include) you can preload the related entities.

// When n-level details have to be retrieved.
var contacts = (from contact in context.Contacts.Include("SalesOrderHeaders.SalesOrderDetails")
                select contact).FirstOrDefault();

// When multiple related entity sets have to be included.
ObjectQuery<SalesOrderHeader> query =
    context.SalesOrderHeaders.Include("SalesOrderDetails").Include("Address");


Security

Can't end the article without mentioning security. I evaluated the classic problem of SQL injection. The old technique of using parameterized queries is still valid in the Entity Framework.

// SQL injection possible: untrusted input (userInput) concatenated directly into the command text.
context.ExecuteStoreQuery<Product>("select * from Products where pid = " + userInput);
// Guarded against SQL injection: the value is passed as a parameter.
context.ExecuteStoreQuery<Product>("select * from Products where pid = @p0",
    new SqlParameter { ParameterName = "p0", Value = 1 });


Overall, I am sure this is going to reduce the developer's work; however, it adds a little more discipline to ensure that developers do not just treat the entities plainly as a set of tables.

That's it for what I was able to read and evaluate about the Entity Framework so far. Hopefully I will add an advanced version of this article where I will touch on transactions and concurrency, and go into a little more detail on the coding with more snippets. Till then, happy blogging!! :)

Don't forget to start reading at http://msdn.microsoft.com/en-us/data/aa937723.aspx

Saturday, September 11, 2010

Windows Server AppFabric Caching Framework


Windows Server AppFabric Caching, also known by its codename Velocity, is a framework that provides a unified, distributed cache. Application caching is nothing new and has been around for many years. It definitely saves those milliseconds, or even more depending on the number of concurrent users fetching the data.

Earlier, the cache used to be part of the web servers. Later, it moved into the application servers. However, the capability of the cache depended on the capacity of the application server. With more and more concurrent users, it became important to have much bigger caching servers. In order to prevent any bottlenecks on these servers, there was a need for a caching framework that could provide all the capabilities of load-balanced web servers.




The following are some of the functionalities that were expected from, and delivered by, the Velocity framework:
  • Load balancing of the cache servers
  • Provision to scale dynamically without having to stop the applications
  • High availability in case some cache servers go down
  • Maintaining consistency of the data copies stored across cache servers
  • A mechanism to invalidate the cache when the actual data store changes

To Install & Configure the framework follow the link http://msdn.microsoft.com/en-us/library/ff383731.aspx 

There are two places to store the cluster configuration:
  • Network shared folder - usually for smaller deployments (1-5 servers)
  • SQL Server - for larger enterprises (more than 5 servers)
Configuration can be done either programmatically or by using Windows PowerShell. The AppFabric/Velocity-related cmdlets are installed as part of the framework.

The following are some of the terminologies used in the Velocity framework:
  • Cache Server
  • Cache Host - the Windows service that hosts the cache
  • Cache Client - the web application accessing the cache
  • Local Cache - data is stored in the memory of the web application
  • Server Cache - data is serialized and saved on servers other than the web application server
  • Lead Host - responsible for storing cache data and also for coordinating with the other hosts to manage the integrity of the cache cluster
  • Cache Cluster - a set of cache hosts working together to provide a unified cache view

There are two ways of partitioning a cache. You can configure data to go into one of these partitions to manage the cache effectively for performance and also to isolate the invalidation effect from other cache data (a small client-side sketch follows this list):
  1. Named Cache
  2. Named Region
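
A minimal client-side sketch, assuming the AppFabric client assemblies are referenced and the cluster already contains a cache host named CacheServer1 and a named cache called default (these names are assumptions for illustration):

using System;
using System.Collections.Generic;
using Microsoft.ApplicationServer.Caching;

class CacheDemo
{
    static void Main()
    {
        var servers = new List<DataCacheServerEndpoint>
        {
            new DataCacheServerEndpoint("CacheServer1", 22233)    // default cache port
        };
        var config = new DataCacheFactoryConfiguration { Servers = servers };

        DataCacheFactory factory = new DataCacheFactory(config);
        DataCache cache = factory.GetCache("default");            // named cache

        cache.CreateRegion("Books");                               // named region inside the cache
        cache.Put("book:1", "Entity Framework in Action", "Books");
        string title = (string)cache.Get("book:1", "Books");
        Console.WriteLine(title);
    }
}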


Memory management
There are two ways of invalidating the cache (see the timeout sketch after this list):
  • Timeout
  • Notification (default polling interval of 300 seconds) - the client checks for invalidation notifications at this interval.
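
A tiny sketch of timeout-based invalidation (reusing the cache object from the sketch above):

// This item is evicted automatically ten minutes after it is put into the cache.
cache.Put("book:1", "Entity Framework in Action", TimeSpan.FromMinutes(10));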

Periodically, invalidated cache items are cleaned up for effective memory management. However, there may be cases where the framework chooses to evict cached data when memory is under pressure. Your application may get cache misses or exceptions if it is programmed on the assumption that the data will always be in the cache, so it is a must to write your application to handle such cases.

The eviction strategy used by the Velocity framework is LRU (Least Recently Used), i.e. the least recently used data gets evicted first.

High availability is achieved by making copies of the Cache data. The number of copies that needs to be maintained can be configured.

  
Security
Security is handled at the transport level. The Velocity framework can be configured to allow only certain user contexts to access the cache servers. You should allow the user context of the web application's application pool to have access to the cache servers.

Lastly, ASP.NET 4.0 applications can leverage the Velocity framework for storing session state.
Hopefully I have touched on some concepts of the new caching framework. Can't wait to implement this on my next project.
References:
http://msdn.microsoft.com/en-us/library/ee790954.aspx

Saturday, August 21, 2010

Rapid Development of Web Pages

Yesterday, I set myself to develop a simple web application and targeted an hour for it. I used VS 2010 for my development purpose.

The application will do the following:
  • List a set of users
  • List a set of books
  • Perform a simple search on the Users vs. Books borrowed (through IDs)
  • Allow users to borrow & return books. Entries to be made accordingly
The output turned out to be something like this:


The following three layers were considered:
  1. Database
  2. Data Access Layer
  3. UI/Presentation Layer
For brevity, I chose to ignore the Business Layer and any Service Layer.

Database Layer
Here I created a DB called Library and added three tables with the following constraints. The image below is self-explanatory as to what each of these tables does.

Data Access Layer
This was going to be a class library. I could have chosen to merge this with the web application, but I thought adding some amount of modularization wouldn't hurt.
Here I chose to add an ADO.NET Entity Data Model. This gives a data access view of the tables in your .NET application. Follow the steps below to create it:
Add New Item on the DAL > Visual Studio C# Items > Data > ADO.NET Entity Data Model > specify a valid model name, say LibraryModel.edmx, and go to the next page.
Select Generate from Database and go to the next page.
Select the database connection and leave the rest at the defaults. You may want to change the identifier for the connection string that will go to App.Config/Web.Config, say LibraryEntities.
Select the tables you want to make visible in the .NET application, specify the model namespace, say LibraryModel, and click Finish.

Your class library should look something like this:

Now, I created a simple facade class to wrap some of the DB operations. I called that class LibraryManager.cs.
This class uses LINQ to Entities for faster development against the tables.

The file looked something like this:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace DataAccess
{
    public class LibraryManager
    {
        private LibraryEntities dbContext;

        public LibraryManager()
        {
            dbContext = new LibraryEntities();
        }

        public List<tblBook> GetBooksList()
        {
            var query = from p in dbContext.tblBooks
                        select p;
            return query.ToList();
        }

        public List<tblUser> GetUsersList()
        {
            var query = from p in dbContext.tblUsers
                        select p;
            return query.ToList();
        }

        public bool AddBook(tblBook book)
        {
            dbContext.AddTotblBooks(book);
            dbContext.SaveChanges();
           
            return true;
        }

        public bool AddUser(tblUser user)
        {
            dbContext.AddTotblUsers(user);
            dbContext.SaveChanges();

            return true;
        }

        public bool BorrowBook(int nBookId, int nUserId, DateTime date)
        {
            tblUserBook userBook = new tblUserBook();
            userBook.nBookId = nBookId;
            userBook.nUserId = nUserId;
            userBook.sBorrowedDate = date;
            dbContext.AddTotblUserBooks(userBook);
            dbContext.SaveChanges();

            return true;
        }

        public List<tblUserBook> SearchBorrowedItemsByBookAndUser(int nBookId, int nUserId)
        {
            var query = from p in dbContext.tblUserBooks
                        where (p.nBookId == nBookId && p.nUserId == nUserId)
                        select p;
            return query.ToList();
        }

        public bool ReturnBook(int nUserBookId, DateTime date)
        {
            tblUserBook userBook = dbContext.tblUserBooks.Single(p => p.nId == nUserBookId);
            userBook.sReturnedDate = date;
            dbContext.SaveChanges();

            return true;
        }
    }
}
 

UI/Presentation Layer
Here I just created a simple Web Application and added a new item "Web Form using a master page" called Users.aspx, which is tied to the default master page.

In order to access the database through the Entity Framework, you need to have the connection string set in your web.config.
Copy the connection string from the App.config of your Data Access Layer into your web.config.

Users.aspx had the following controls:
Grid views to show the list of Users, Books and the User vs. Books search results
Other simple controls to perform the Borrow and Return book features

It is important to highlight how easy it is to bind the List returned from the DAL to a grid.

Observe the BoundField DataField names and the DataSource/DataBind calls below.

Users.aspx
<%@ Page Title="" Language="C#" MasterPageFile="~/Site.Master" AutoEventWireup="true" CodeBehind="Users.aspx.cs" Inherits="Library.Users" %>
<asp:Content ID="Content1" ContentPlaceHolderID="HeadContent" runat="server">
</asp:Content>
<asp:Content ID="Content2" ContentPlaceHolderID="MainContent" runat="server">
    <asp:GridView ID="gvUsers" runat="server" AutoGenerateColumns="false">
    <Columns>
       <asp:BoundField HeaderText="ID" DataField="nId" />
       <asp:BoundField HeaderText="Name" DataField="sName" />
    </Columns>
    </asp:GridView>

    <br />

    <asp:GridView ID="gvBooks" runat="server" AutoGenerateColumns="false">
    <Columns>
       <asp:BoundField HeaderText="ID" DataField="nId" />
       <asp:BoundField HeaderText="Name" DataField="sName" />
       <asp:BoundField HeaderText="ISBN" DataField="sISBN" />
       <asp:BoundField HeaderText="Is Active" DataField="bActive" />
    </Columns>
    </asp:GridView>

    <br />
    <asp:Label ID="lblBorrowMessage" runat="server" Text=""></asp:Label>
    <br />
    <table>
    <tr>
    <td><asp:Label ID="lblSearchBookId" runat="server" Text="BookId"></asp:Label></td>
    <td><asp:TextBox ID="txtSearchBookId" runat="server"></asp:TextBox></td>
    <td><asp:Label ID="lblSearchUserId" runat="server" Text="UserId"></asp:Label></td>
    <td><asp:TextBox ID="txtSearchUserId" runat="server"></asp:TextBox></td>
    <td><asp:Button ID="btnSearchBookUserId" runat="server" Text="Search"
            onclick="btnSearchBookUserId_Click" /></td>
    <td><asp:Button ID="btnBorrowBook" runat="server" Text="Borrow" onclick="btnBorrowBook_Click"
             /></td>
    </tr>   
    </table>
    <br />
    <asp:GridView ID="gvBookUserId" runat="server" AutoGenerateColumns="false">
    <Columns>
       <asp:BoundField HeaderText="ID" DataField="nId" />
       <asp:BoundField HeaderText="Borrowed Date" DataField="sBorrowedDate" />
       <asp:BoundField HeaderText="Returned Date" DataField="sReturnedDate" />
    </Columns>
    </asp:GridView>

   
    <br />
    <asp:Label ID="lblReturnMessage" runat="server" Text=""></asp:Label>
    <br />
    <table>
    <tr>
    <td><asp:Label ID="lblReturnBook" runat="server" Text="BookUserId"></asp:Label></td>
    <td><asp:TextBox ID="txtBookUserId" runat="server"></asp:TextBox></td>
    <td><asp:Button ID="btnReturnBook" runat="server" Text="Return"
            onclick="btnReturnBook_Click"/></td>
    </tr>   
    </table>
    <br />


</asp:Content>



Users.aspx.cs
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.UI;
using System.Web.UI.WebControls;

using DataAccess;

namespace Library
{
    public partial class Users : System.Web.UI.Page
    {
        private LibraryManager libraryMgr;

        protected void Page_Load(object sender, EventArgs e)
        {
            libraryMgr = new LibraryManager();
            gvUsers.DataSource = libraryMgr.GetUsersList();
            gvUsers.DataBind();
            gvBooks.DataSource = libraryMgr.GetBooksList();
            gvBooks.DataBind();

            lblBorrowMessage.ForeColor = System.Drawing.Color.Black;
            lblBorrowMessage.Text = "";
        }

        protected void btnSearchBookUserId_Click(object sender, EventArgs e)
        {
            int nBookId = Convert.ToInt32(txtSearchBookId.Text.Trim());
            int nUserId = Convert.ToInt32(txtSearchUserId.Text.Trim());
            gvBookUserId.DataSource = libraryMgr.SearchBorrowedItemsByBookAndUser(nBookId, nUserId);
            gvBookUserId.DataBind();
        }

        protected void btnBorrowBook_Click(object sender, EventArgs e)
        {
            int nBookId = Convert.ToInt32(txtSearchBookId.Text.Trim());
            int nUserId = Convert.ToInt32(txtSearchUserId.Text.Trim());
            if (libraryMgr.BorrowBook(nBookId, nUserId, DateTime.Now))
            {
                lblBorrowMessage.Text = @"Data has been successfully recorded. Have fun reading.";
                lblBorrowMessage.ForeColor = System.Drawing.Color.Green;
            }
            else
            {
                lblBorrowMessage.Text = @"There was an error while trying to Borrow the Book.
                                        Contact the Administrator.";
                lblBorrowMessage.ForeColor = System.Drawing.Color.Red;
            }
        }

        protected void btnReturnBook_Click(object sender, EventArgs e)
        {
            int nBookUserId = Convert.ToInt32(txtBookUserId.Text.Trim());
            if (libraryMgr.ReturnBook(nBookUserId, DateTime.Now))
            {
                lblReturnMessage.Text = @"Data has been successfully recorded. Hope you had fun reading.";
                lblReturnMessage.ForeColor = System.Drawing.Color.Green;
            }
            else
            {
                lblReturnMessage.Text = @"There was an error while trying to Return the Book.
                                        Contact the Administrator.";
                lblReturnMessage.ForeColor = System.Drawing.Color.Red;
            }
        }
    }
}

To wrap up, the things that helped me achieve my one-hour target were:
  • Using Management Studio for SQL Server 2008
  • Creating the ADO.NET Entity Data Model
  • Using LINQ to do CRUD on the table data
  • Using grids and binding data
Two links worth mentioning are:
http://www.telerik.com/help/silverlight/consuming-data-linq-to-ado-net-entity-data-model.html
http://www.mikepope.com/blog/DisplayBlog.aspx?permalink=1419&count=no 

Well, in just a little more than an hour, I was able to develop what I set out for. Mission accomplished :)

Thursday, August 19, 2010

Centralized Content Type Hub for publishing

This article will demonstrate the steps to create a central repository for storing all the content types in a single Site collection and allow it to be used by all the Site Collections (from any Web Application in the farm).

This feature is new in SharePoint 2010. In MOSS third party components had to be used to achieve this kind of support.

In order to achieve this, the steps below need to be followed:
1. Configure the Managed Metadata Service Application (Properties). Type the site collection URL which will be used for publishing all the content types. Choose that site collection carefully, because there is no feature to modify the hub later; you would have to create another Managed Metadata Service Application to change the hub.

An example URL is http://mtalt005:85/sites/hub/


2. Configure the Connection of the Metadata Service Application (Properties). You can find this below the Managed Metadata Service Application. Enable the Site collections to be able to consume Content Types from the above hub.



3. Enable the Site Collection Feature "Content Type Syndication Hub" of the Hub Site collection.



4. Now create a Content Type in the hub site collection. Site Settings > Site Content Types > Create
Say, you give an AircraftImage whose Parent Content Type is an Image from the Digital Asset Content Types group.

5. Once the content type is created you will be able to publish it for other site collections to consume. To publish, go to the content type you created and, under the setting "Manage Publishing for this content type", select Publish and click OK. If it is already published then you will have options to Unpublish (existing consumed content types will be de-linked and become local copies) or Republish (in case of updates to the content type). All subscriptions are driven by the timer jobs.

The jobs below have to be started forcibly (through Run Now), or you have to wait till their scheduled time.
Central Admin > Monitoring > Review Job Definitions:
Content Type Hub
Content Type Subscriber

Note: You may still need to wait until the previous jobs have run; they are processed as a queue, so have patience.
If you don't see the jobs running, ensure that the SharePoint timer is running: check the status of the "SharePoint 2010 Timer" Windows service.
This is a CPU-intensive service; I usually prefer to keep it off on my dev box.

6. Once the Synchronization is done you should be able to see the Content Types in any of the Site collections (Team Sites) from any web application that has been configured for the Subscription (i.e the Content Type Subscriber Job for that web application has run).
To see the subscribed list of Content Types,
Site Settings > Site Collection Administration > Content Type Publishing



With this you are now ready for centralized content type publishing. This will help you in managing the updates to each of the content types.

Tuesday, August 10, 2010

Usability Enhancing features in SharePoint 2010


Some of the usability features that I came across in SharePoint 2010:

You can click on Edit (the Page ribbon button) and add content inline while viewing the page. Use the Format Text tab to help in formatting the font and color of the text.



The Insert ribbon in edit mode gives a nice set of capabilities that enhance authoring.


You can now:
  • Add an external link; clicking it navigates the page to that link.
  • Point to an internal SharePoint list item. To do this, just type [[ in edit mode and follow the IntelliSense to the required list item. In view mode, clicking it shows the list item in an AJAX-enabled dialog, which keeps you within the context of the page while showing the content of the list item. You can also upload a document in a dialog and provide a link to it.
  • Upload an image into one of the document libraries from the local computer and have it rendered inline on the page.
  • Link an image from an external (Internet) source and render it inline on the page.


The Design ribbon gets enabled when working with images. You can do some basic image manipulation, like setting the aspect ratio, borders and positioning on the page.


These are some of the capabilities that have made rich content authoring in SharePoint 2010 easy.

Friday, August 6, 2010

Windows Workflow Foundation 3.5 Overview

Any business process with a predefined set of tasks/activities is a good candidate for a workflow, especially one that takes time between tasks, like an approval process, or one that waits on external events, like a file drop or a new request from a client through, say, a SharePoint list item being added.

Without workflow, such processes (applications) had to keep an instance always waiting, which wasted a lot of CPU cycles. WF provides the capability to share data between the tasks of a workflow instance. The ability to persist state enables the application to be unloaded while waiting on an event and invoked again when the event occurs. It is not required for all the activities in a particular instance of a workflow to be executed by the same thread, or even the same process.

A good use case is ASP.NET pages: page flows can be modeled as a WF workflow (e.g. a purchase workflow), which gives you the capability to change the workflow dynamically without having to touch the pages.

Advantages of WF Framework
  • Componentizing code so that a different chunk of software executes at each step
  • Tools to create and modify workflows graphically
  • Ability to monitor the status of the Workflow
  • Rich error handling and provision for rolling back an activity
  • Dynamically modifying the workflow by adding a task

The fundamental components of WF are the following:
  • Activity: a unit of work. (Some Business Process Management products use the term 'task'.)
  • Workflow: a set of activities. Defines the flow of the activities based on business logic.
  • Workflow Designer: a graphical tool, typically hosted by Visual Studio, that can be used to create and modify WF workflows and activities.
  • Base activity library: standard out-of-the-box activities provided by WF.
  • Runtime engine: a WF-supplied library that executes workflows. The runtime engine also provides other services, such as mechanisms for communicating with software outside the workflow, e.g. an assembly method, a WCF service, etc.
  • Runtime services: a group of services that workflows can use, including support for transactions, persisting a workflow's state, tracking its execution, and more.
  • Host process: a Windows application that hosts the WF runtime engine and any workflows it executes.
There are two kinds of WF workflows:
  1. Sequential (has a start and an end)
  2. State Machine (transitions from one state to another based on events)
Check out MSDN for the list of activities in the base activity library.

You can develop workflows using the Visual Studio designer or by writing code directly.

You will use the following namespaces while coding.
System.Workflow.Activities, System.Workflow.ComponentModel, and System.Workflow.Runtime

A snippet to create a workflow would be as follows:
using System.Workflow.Activities;

public class ExampleWorkflow : SequentialWorkflowActivity
{
}

Workflows can also be defined using the XML-based eXtensible Application Markup Language (XAML). The markup equivalent of the class above looks something like this:
<SequentialWorkflowActivity x:Class="ExampleWorkflow"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/workflow"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" />

A simple way to create an activity in C# is:
using System.Workflow.ComponentModel;

public class ExampleActivity : Activity
{
}
A common hosting program will look something like this:
using System.Workflow.Runtime;

class ExampleHost
{
    static void Main()
    {
        WorkflowRuntime runtime = new WorkflowRuntime();
        runtime.StartRuntime();
        WorkflowInstance instance = runtime.CreateWorkflow(typeof(ExampleWorkflow));
        instance.Start();
        …
    }
}

WF ships with a set of services that allow the runtime to execute within ASP.NET applications.

The ASP.NET-based host that ships with WF relies on SQL Server for persistence.

The WF runtime acts as an intermediary for all communication with all workflows, i.e. even for unloaded workflows that are waiting on an input/event to execute an activity.

To communicate with other objects in the same Windows process, a workflow can use two activities from the base activity library (a small sketch of the contract they bind to follows this list):

  1. CallExternalMethod activity: allows calling a method on an object outside the workflow.
  2. HandleExternalEvent activity: allows receiving a call from an object outside the workflow.
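
A small sketch of the local-service contract these two activities bind to (the interface and member names here are assumptions, not from the original post):

using System;
using System.Workflow.Activities;

// Marks the interface as a local service for workflow <-> host communication.
[ExternalDataExchange]
public interface IApprovalService
{
    // Invoked from the workflow via a CallExternalMethod activity.
    void RequestApproval(string documentId);

    // Raised by the host and received in the workflow via a HandleExternalEvent activity.
    event EventHandler<ExternalDataEventArgs> ApprovalCompleted;
}
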
In order to communicate over WCF, there are two standard activities:
  1. Send: Sends a request using WCF, then optionally waits for a response.
  2. Receive: Receives an incoming request via WCF, then sends a response.

WF ships with some built-in support for roles defined using Windows accounts, Active Directory, and ASP.NET, but other role management systems can also be used.

Windows Workflow Foundation and Windows SharePoint Services:
Using standard WF tools, such as the Workflow Designer, developers can create workflow applications that address document collaboration and other kinds of information sharing.

Conclusion:
A very generic framework for designing and developing business processes has been created in the form of Windows Workflow Foundation. This should help developers speed up their programming and also ensure better utilization of server CPU by unloading idle processes that are waiting on external inputs.
Using the framework, developers can also write tools that help business users track the status of each workflow instance for better management.

Wednesday, August 4, 2010

Claims based security model in SharePoint 2010

In this article I will briefly touch on the claims-based security model support in SharePoint 2010.

 
SharePoint comes with two kinds of authentication when you are creating a new Web Application.
  1. Classic Mode Authentication (default)
  2. Claims Based Authentication

 

 
Classic Mode is just Windows-based authentication and is kept for backward compatibility.
The new mode of authentication, i.e. Claims Based, works around the concept of an identity and is based on standards such as WS-Federation and WS-Trust and protocols like SAML (Security Assertion Markup Language).
It provides a generic way for applications to acquire identity information from users within and across organizations and also on the internet.
Identity information is contained in a security token, often simply called a token. A token contains one or more claims (trusted pieces of information) about the user. This information stays with the user throughout the session.

 
This is built on the Windows Identity Foundation (WIF). Features of claims-based identity include:
  • Authentication across users of Windows-based systems and systems that are not Windows-based.
  • Multiple authentication types.
  • Stronger real-time authentication.
  • A wider set of principal types.
  • Delegation of user identity between applications. (Can resolve Double Hop issues easily)

 
Out of the box, SharePoint supports authenticating using Windows & Forms (both supported in MOSS), LiveID/OpenID. However, integrating with custom Authentication providers is easily possible as long as the application can trust the Issuing Authority of the Security Tokens.

 
Definitions of some of the concepts that you need to be aware of are as follows:
  • Identity: the security principal used to configure the security policy
  • Claim: an attribute of an identity (login name, AD group, etc.)
  • Issuer: the trusted party that creates claims
  • Security Token: a serialized set of claims, digitally signed by the issuing authority (a Windows security token or a SAML token)
  • Issuing Authority: issues security tokens, knowing the claims desired by the target application
  • Security Token Service (STS): builds, signs and issues security tokens
  • Relying Party: an application that makes authorization decisions based on claims

 
There are two cases of claims: incoming and outgoing. The scenarios differ in the way the claims get authenticated or validated. See the images below from the MSDN article.
 
 
Hope this gives you a start in understanding, at a very high level, the concepts of the claims-based security model in SharePoint 2010.

Tuesday, August 3, 2010

HTML5! The next generation of HTML

Current browser applications have gone through a phenomenal change. The market demand for richer clients has pushed plugins like Silverlight, Flash, Java applets, etc. onto the browsers.

However, HTML is still the base for most web applications (including SharePoint) when it comes to support for all browsers. In order to reduce the dependency on such third-party plugins, it was necessary to add more capabilities to the HTML vocabulary.

HTML5 is still a work in progress. However, most modern browsers have some HTML5 support. Sadly, it is not supported on IE8.

One particular thing interested me in HTML5: its capability to store data on the client.
It offers two new mechanisms for storing data on the client:
• localStorage - stores data with no time limit
• sessionStorage - stores data for one session

Earlier, cookies were used to store data, but they only stored limited information representing the user. In most cases just a session ID was stored in the cookie, while the actual data was stored in session state on the server. The reason for doing that was to reduce the overhead of passing the data between the client and the server on every request.

With HTML5 storage, the data does not have to be passed to the server with every request, but is used only when required. It is possible to store large amounts of data without affecting the website's performance.

The data is stored in different areas for different websites, and a website can only access data stored by it.

I am sure this will open the doors for a lot of offline capabilities of the web sites.

Monday, August 2, 2010

Overriding SharePoint My Profile: Ask me About (Expertise) links

The My Profile home page comes with an Ask Me About web part, which is nothing but the list of expertise each user has added to their user profile to let others know about it.
When a user browsing the profile clicks on an expertise link, it opens in the Noteboard web part, which posts the question to the person who listed the expertise.
If the Noteboard web part is disabled, it opens the question in the mail client available on the machine.

There was a bug in this: it opened a mail client and also a blank browser window. On closer observation I found that clicking on the expertise link was calling a JavaScript function, NavigateToNoteboard().
This JavaScript was coming from the web part, whose page I was not able to find, or did not want to change, as it was a standard web part used everywhere.

This is where I figured out that you can override JavaScript: the last definition of a function on a page overrides the earlier ones. So a simple hack of overriding this function immediately after the web part control registration solved the problem.

Here is the link that I used for my overridden JavaScript to open the question in a mail client.

http://www.webmasterworld.com/javascript/3290040.htm


document.location.href = "mailto:" + toAddress + "?subject=" + subject;

Hope this was useful. In the above, I was able to showcase:
  1. The bug that opens a blank web page when the Noteboard web part is removed.
  2. Overriding JavaScript.
  3. Opening a mail client through JavaScript.

Sunday, August 1, 2010

Some Usability concerns on SharePoint 2010

This article lists some of the UI concerns I faced during my work over the last 3 weeks. Hopefully Microsoft improves some of these in the next patch.

  1. The User Profile property view takes a lot of time to load: no idea what is going on here.
  2. User Profile AD sync with 35K user records takes a lot of time loading the top-level tree. Even expanding the tree goes through minutes of wait time, and there is no visual indicator during the wait.
  3. User Profile property association with BDC: the default view makes the user believe that the association is already made.
  4. The external content type picker doesn't show proper values. It just shows the source, i.e. the DB name for the SQL Server type. It gets very confusing when there are multiple content types which share the same DB but map to different tables.
  5. The view distorts the user id, especially after setting the admin credentials for the Secure Store Service. Ex: server\administrator becomes i:0#.w|server\administrator; this leaves the user wondering whether he has given the correct values or not.
  6. Changing the master page on the Search site collection and trying to add a web part page did not provide a scroll bar to find the Add button. Luckily the tabs were working.
  7. The My Site manage keywords page is skewed.
I am all excited to head back home from my SharePoint onsite assignment. If my next assignment is in SharePoint then I may add more stuff under this blog.

Saturday, July 24, 2010

Cannot Edit the Associated LOB through BCS Association

There is no way (through SharePoint Designer) to provide edit capabilities on a BCS external content type when it is in an association (through a foreign key) with another one.

Read on, to know how I tried to attempt it..

Today I was able to work on the write-back capabilities of BCS. It is straightforward to create a BCS external content type on a SQL Server table.
I just went ahead with Create All Operations on my data source.
In the wizard, make sure to uncheck the Read-Only property for those columns which need to be updatable.
Once you are done with the wizard, click Finish.

So far, so good. But I wanted to see how BCS behaves when I have associated tables. I wanted two sets of data coming from different sources, say one coming from SQL Server and the other from WCF. For the testing, I just tried with another SQL Server table which had a 1:1 association with the first one.
The first table was tblUserExtended and the second one, which had the association, was tblUser.
Both tables were linked through tblUser::nUserId and tblUserExtended::nUserId (think of this as another source which has some extended user data, stored for another application, that has to be aggregated).

Follow the msdn article (http://msdn.microsoft.com/en-us/library/ff728816.aspx) to create the association between two BCS entities.

The reason I did that was to have a single external list which would show columns from both the User and UserExtended tables and allow the SharePoint user to edit some columns of User and some columns of UserExtended.
SharePoint showed only the columns of tblUser, while the foreign key tblUser::nUserId was represented internally as a tuple. So I could not edit the columns of UserExtended in that view, though associating the UserExtended tuple could be done.

SharePoint out of the box does not support editing the associated BCS entity (at least for external content types created using SharePoint Designer).

May be there is a way to do it, but, I could not find it. Let me know if you have a solution for this.

Friday, July 23, 2010

Programming using BCS (Business Data Connectivity) API

All searches in Bing and Google for the BCS API lead to writing a .NET connector for BCS. But today I had to write a complex component which required me to write code against BCS to retrieve LOB data.
I know you may ask why BCS when you already have connectivity to the source behind BCS. The reason being, I wanted BCS to work something like an entity model, used in the context of SharePoint 2010.

Thanks to chaholl's article I was able to perform it easily. Here is the link for it.
http://www.chaholl.com/archive/2010/03/15/creating-a-web-service-to-access-bcs-data-in-sharepoint.aspx

Today, I also downloaded the SharePoint 2010 SDK. It comes with a lot of samples and help for programming with SharePoint. It takes hardly 200 MB of your hard drive and is definitely worth that space if you are planning to do a lot of SharePoint coding.

Something that I came across in the SDK articles that is worth the mention,

Restrictions in BDC:
BDC does not support the ICollection or IEnumerable interfaces to represent collections in data structures, nor the generic ICollection, IEnumerable, and IList interfaces; all collections must implement IList.
For a Finder method returning multiple items, the return value is required to implement either IEnumerable or IEnumerator (except for databases, where only IDataReader is supported). BDC does not support the generic versions of IEnumerable and IEnumerator.

Wednesday, July 21, 2010

Displaying Business Data List Web Part based on a Query String value.

Here I will demonstrate how to display external content data in a list format, filtered (at the web service) based on a query string value.

I would recommend reading my previous blog, titled “BDC/BCS integration using WCF Service”, to get more clarity on how I created the external content type.

Background: The External content is being serviced through Web Service integration of Business Connectivity Service.
Web Service Finder (Read List) Operation takes in a parameter called userName and returns a List collection.

It was straight forward to select Business Data List web part.
1. Go to a page where the Web Part is to be added.
2. Site Actions -> Edit Page ->
3. Select Edit Tools (Insert) -> Select some location on the canvas and Click on Web Part.
4. Category: Business Data -> Web Parts: Business Data List
5. Click on Add
6. Configure the Data List Web Part by clicking on OPEN THE TOOL PANE link.
7. On the Properties pane on the right hand side, Select the External Content Type and click on OK.

What made it more complicated was that I needed to display the list based on a filter that is dynamic, and the UI should not have a filter for the user to select from. Follow the steps below to get things working the way they have to.

I had defined a filter while defining the external content type; let's say the name of the filter used was filterParam, of type Comparison.

Create another web part to feed the Data List web part with a dynamic value (say, a query string):
1. Category: Filters -> Web Parts: Query String (URL) Filter
Note: This web part will not be configurable unless you click on Edit Web Part of the Business Data List web part.
2. Click on Open the Tool Pane of the Query String Filter web part.
3. Set the Filter Name and Query String Parameter Name as 'userName'.
4. Modify the advanced filter options if required, to configure no-query/multi-value query behavior.
5. Click OK.

Create an association between the Query String Filter web part and the Business Data List web part:
1. Click on Connections (found below Edit Web Part) on the Data List web part.
2. Get Query Values From -> Query String (URL) Filter.
3. This will show a pop-up to select the external content type parameter (the 'filterParam' filter as defined in the external content type) to associate the query string with.
4. Select the parameter and click Finish.
Finally, ensure you do a Save and Close on the ribbon to save the two added web parts with their configurations.


Test the URL with different query string value
Ex: http://server:port/sites/teams/SitePages/Collegues.aspx?userName=Bob

It should work! If not then Bing it and find it. That's what I did :)

To hide the filters on the web part and also to use a My Profile property as a filter, do the following:
1. Click on Edit Web Part of the Data List web part.
2. You should see an 'Edit View' link on the web part.
3. You can use filters like User Profile properties on which the external content type will filter, i.e. they will be passed as a parameter to the web service.
4. Disable the 'Allow user to change the criteria' check box if you do not want the filters to be shown on the web part.

Tuesday, July 20, 2010

BDC/BCS integration using WCF Service

I am sure you will find thousands of links on Business Connectivity Services (BCS) integration with WCF services, but I was still not able to get it straight. I am documenting the pain points I faced, hoping it will save you that precious day.
As the new term in SharePoint 2010 is BCS, I will try not to use its predecessor term BDC (Business Data Catalog). However, both terms are used interchangeably in the blogs you will find on the net.
The following are things you may want to verify when your WCF integration with your LOB is not working.
1) Ensure that you deploy the WCF service onto a web site in IIS. Don't try to host the WCF service in Visual Studio by debugging; it will throw an error that the endpoint cannot be found.
Don't blame SharePoint Designer for not being able to find an endpoint hosted in Visual Studio.

2) When giving the URL in SharePoint Designer, give a fully qualified name of the machine and not just the machine name.
Localhost is not welcome.
You can add an entry in the hosts file at c:\windows\system32\drivers\etc\hosts
I added something like:
127.0.0.1 customers.server.com
My service address was something like this:
http://mtalt005.server.com:8888/Collegue.svc?wsdl

3) Ensure that the WCF service has anonymous access, or else make sure that the service account running the WCF service has the appropriate permissions.
If specific credentials have to be used, you may want to consider adding an account in the Secure Store Service of SharePoint and using that Secure Store Application ID when configuring the BCS.

4) Your service should return a composite (custom class) object for the Specific Finder (ReadItem) operation and an array of composite objects for the Finder (ReadList) operation. A string, or an array of strings, is not a composite object.

5) You will have to define the filters when you are defining the input for the Read List operation. This is useful when you are expecting some input and returning a list of objects.
Ex: pass a userId as input to the WCF operation and expect a list of Collegue objects.
// Sample
public class ColleguesSvc : ICollegue
{
    public List<Collegue> GetAllCollegues(String userId)
    {
        // Read the DB for the given userId, build a list of Collegue objects and return them.
        …
    }
}
In the above case I had to define a filter at the Input Form of the ReadList Wizard.
FilterType = Comparison
Operator = Equals

6) Ensure that the input parameter is a String and not an Int32. (I am not sure about this, but I was facing some issue when it was an Int32.)

7) Ensure that you developed the WCF service with the target framework set to 3.5.
Having 4.0 should not harm, but you will have to make additional provisions, such as reinstalling IIS, to ensure that it is compatible with 4.0. Windows Server 2008 R2 comes with Framework 3.5 by default. If you really want the WCF service to be 4.0, then make sure the application pool is running under 4.0 and not 2.0.

In the next blog I will mention about displaying the WCF content on a List.

Friday, July 16, 2010

First take on SharePoint 2010

With no prior experience of MOSS 2007, I thought it would be hard to adopt SharePoint 2010. I enrolled in a four-day training course and was put on to an assignment in SharePoint 2010.

In the subsequent blogs I will detail my experience and learnings on the topics I come across in SharePoint 2010.

Overall, I feel Microsoft has done a great job in providing a lot of developer tools and also in improving the user experience.
Hats off for the exhaustive collection of MSDN articles; they have structured it very well.

Start with the following links:
http://technet.microsoft.com/en-us/library/cc794341.aspx
http://msdn.microsoft.com/hi-in/sharepoint/bb964529(en-us).aspx
http://technet.microsoft.com/hi-in/sharepoint/ff601871(en-us).aspx

There is a lot of improvement in both Office and SharePoint 2010 and in the collaboration between these products.

My best takes:
Service Application Architecture enhancing the SSP architecture.
Offline capabilities through SharePoint Workspace.
Office on the web.
Better UI navigation compared to its predecessors.

Some tools that greatly enhance the capabilities of a SharePoint developer are
Visual Studio : Through a lot of plugins and templates.
SharePoint Designer : Rapid UI development

The collaborative capabilities of these applications make it more attractive for organizations to think about investing in SharePoint 2010, to ensure that they make the best use of the data that is currently not reaching all of their information workers.