Wednesday, October 13, 2010

Troubleshooting the Setup of PowerPivot for SharePoint

I know it can be really frustrating when you spend long hours just trying to set up something on your developer box, especially when you have followed the instructions exactly (maybe not) as in the MSDN article.

The following MSDN article walks you through setting up PowerPivot for SharePoint.

I will try to highlight some of the gotchas you need to watch out for while you perform the setup.

The following is the configuration I had on my dev box:
  • Domain Controller with Active Directory 
  • Windows Server 2008 R2
  • SharePoint 2010
  • SQL Server 2008 R2 Dev edition (With SSIS, SSAS & SSRS)
  • Office 2010
  • .Net 3.5 SP1 or above
  • PowerPivot Add-in for Excel 2010  (http://www.powerpivot.com/download.aspx)

While following the setup article, you may run into an error like the following:

Failed to load receiver assembly "Microsoft.AnalysisServices.SharePoint.Integration, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91" for feature "PowerPivot" (ID: f8c51e81-0b46-4535-a3d5-244f63e1cab9).: System.IO.FileNotFoundException: Could not load file or assembly 'Microsoft.AnalysisServices.SharePoint.Integration, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified.
File name: 'Microsoft.AnalysisServices.SharePoint.Integration, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91'


If you see the above error, then you will be really interested in the following important steps.
  1. In the Installation Type step of the wizard, select “New installation or add shared features”.  DO NOT select the “Add features to an existing instance…” option.
  2. In the Setup Role step of the wizard, select the “SQL Server PowerPivot for SharePoint” radio button.  DO NOT select the “SQL Server Feature Installation” option. In the Instance Configuration step, keep the Instance ID as POWERPIVOT, and keep the default MSSQLSERVER instance of SQL Server Analysis Services separate from the instance required for PowerPivot.
  3. In the Service Configuration step, provide a domain account that has rights on both SQL instances. I know it is not a good practice, but what the hell, this is supposed to be a developer box. I used the domain admin :)
  4. In the Ready to Install step, you will have to browse to the configuration file path shown at the bottom of the dialog. Modify the FARMADMINPORT value to point to the existing SharePoint Central Administration port.
  5. Before you move to the next step in the wizard,
    • Ensure that you have Microsoft.AnalysisServices.SharePoint.Integration.dll in C:\Program Files\Microsoft SQL Server\100\Setup Bootstrap\SQLServer2008R2\x64. If it is missing, this dll can be copied from the GAC, where one of your previous failed attempts may have left it.
    • Remove all other instances of Microsoft.AnalysisServices.SharePoint.Integration.dll, especially from the GAC. Find any other copies of it (e.g. in the GAC temp folders) and delete them as well.
    • Note: If you are not able to delete some file because a process is holding it, find out which process and use Task Manager or the Windows Services console to terminate/stop it. Usually it will be w3wp.exe, OWSTimer.exe, or the Web Analytics services. You can use Process Explorer to identify the holding process: http://technet.microsoft.com/en-us/sysinternals/bb896653.aspx
  6. Good Luck here.. before you press that Install button. 

Have patience on the last step; it could take a while to finish. I had to start the SharePoint Timer Service, thinking that maybe the farm deployment workflow was not completing because the timer service had been terminated in step 5 (I am not sure about this; just in case you face problems).
A comprehensive guide to setup is available here




After the installation, you can follow the MSDN article for configuration.

One tip before you leave:
Once you install the PowerPivot add-in for Excel, you may notice that Excel loads more slowly. If you wish to enable/disable this add-in, go to Excel -> File -> Options -> Add-ins -> Manage (COM Add-ins/Disabled Add-ins)



Hopefully, this information has saved you that precious time.

Now that I have been able to set it up, I am waiting to explore PowerPivot for SharePoint in detail. If I come across anything interesting, I will share my views in the next article. Till then, stay tuned :)

Friday, October 8, 2010

Excel Services (SharePoint 2010)

Excel 2010 is an authoring tool while Excel Services is more of a reporting tool.
The following are the main uses:
  • Sharing and collaborating on spreadsheets through the browser.
  • Report Building and publishing using dashboards.
  • Providing a web-service interface to allow applications to interact with spreadsheets stored on the server.
What's new for Excel Services (SharePoint Server 2010)?

  • Multi-user Collaboration : Lets more than one person author a workbook simultaneously. This feature is also available in SkyDrive; you may want to try it out.
  • Slicer feature : The Slicer is a Business Intelligence data filter in Microsoft Excel 2010. It helps in creating an interactive and flexible design layout for analyzing the business data while using PivotTables and OLAP functions.
  • Ajax Enabled : The service is Ajax enabled, so changes to a range of cells only require refreshing those cells instead of the entire page.
  • Improved User Defined Functions : This helps in creating complex calculations over cells of the workbook.
  • Manage Service Applications : The Service can be enabled in the Central Administration console.
  • Windows PowerShell : Excel Services PowerShell cmdlets aid in installing/configuring from the PowerShell command prompt without going to the Central Administration console. This also helps in automating the build and setup of the application.
  • Trusted Locations : Ensures that all the Workbooks that get loaded are Trusted. These locations are created by default.
  • Unattended Service Account : A special low-privilege account that usually has only read access. It is used when the authentication mode has been set to None.
  • Client Applications: Apart from using the regular Excel services APIs, the service can be exposed through REST (Representational State Transfer) API. You can embed the excel content onto other office apps or on web. Users can also use JSOM or ECMAScript to share Excel content to allow interactive behavior through web browsers.
    Architecture:
    • Excel Calculation Services :  is the engine that loads the spreadsheet and workbook, calculates the spreadsheets, updates external data, and maintains session state for interactivity.
    • Excel Web Access :  is a Web Part component that delivers the Excel workbooks in the browser.
    • Excel Web Services : is a Web Service hosted in SharePoint that provides various methods for developers to create custom applications that are built on the Excel workbook.

    To setup and start using the Excel Services you need to understand the following concepts:

    Connections and Excel workbooks:
    Excel workbooks can contain two types of connections.
    • Embedded connections are stored internally as part of the workbook.
    • Linked connections are stored externally as separate files that can be referenced by a workbook. Linked connection files can be centrally stored, secured, managed, and reused.
    Office data connection files (.odc) are the connection files used in Excel Services Application.    

    Trusted File Locations: Excel SA only loads workbooks from trusted file locations. An administrator has to explicitly mark a directory (SharePoint document libraries, UNC paths, or HTTP Web sites) as a trusted file location. These directories are added to a list that is internal to Excel SA.

    Trusted Data Providers: These are external databases that Excel Calculation Services is explicitly configured to trust when it is processing data connections in workbooks.

    Trusted Data Connection Libraries: These are SharePoint document libraries that contain Office data connection (.odc) files.

    Authentication: Excel Services Application accesses external data sources by using a delegated Windows identity.
    There are three modes of Authentication to the data source
    • Windows Integrated (Requires servers to be on same domain. Kerberos setup required if there is a double hop)
    • Secure Store Service
    • None

    Cache: 
    Excel Services comes with a cache to avoid loading/refreshing the data from the data source on every request. However, at times the cache may need to be reloaded; it can be configured to reload on every open, which helps if different data is to be shown to different sets of users. A cache expiration time can also be set to invalidate the cache and load fresh data.



    Tuesday, September 28, 2010

    Dynamic Language Runtime

    The dynamic language runtime (DLR) is a runtime environment that sits on top of the common language runtime (CLR) and provides a set of services for dynamic languages to run on the .NET Framework. It includes a set of libraries and constructs that allow object types to be resolved at run time, in contrast to statically typed languages like C#, where object types have to be defined at compile time.
    Scripting/interpreted languages like JavaScript and PHP are good examples of dynamic languages.

    Other popular examples are Lisp, Smalltalk, Ruby, and ColdFusion.

    Primary Advantages of DLR
    • Simplifies Porting Dynamic Languages to the .NET Framework
    • Enables Dynamic Features in Statically Typed Languages
    • Enables Sharing of Libraries and Objects
    • Provides Fast Dynamic Dispatch and Invocation

    Architecture
    The main services the DLR offers on top of the CLR include the following:
    • Expression trees: Used to represent Language semantics
    • Call site caching: Caches information about operations and types that have already been executed, so as to achieve faster processing on subsequent calls.
    • Dynamic object interoperability: Provides a set of classes for the language implementers to use & extend their interoperability with .net
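    To make this concrete, here is a minimal sketch of DLR-backed dynamic dispatch using the C# `dynamic` keyword (available since C# 4.0). Member and operator resolution is deferred to run time, and the DLR's call site caching speeds up repeated invocations:

```csharp
using System;

class Program
{
    static void Main()
    {
        // With 'dynamic', the operation is resolved at run time by the DLR,
        // not at compile time by the C# compiler.
        dynamic value = 10;
        Console.WriteLine(value + 5);      // resolved as integer addition

        // The same variable can later hold a different type; the call site
        // is re-bound (and cached) for the new type.
        value = "Hello";
        Console.WriteLine(value + " DLR"); // resolved as string concatenation
    }
}
```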

    Thursday, September 23, 2010

    Overview of Entity Framework 4.0


     "The Entity Framework bridges the gap between how developers commonly manipulate conceptual objects (customers, orders, products; posts, tags, members; wall posts, private messages, friend connections) and the way data is actually stored (records in database tables).  The technical term for a tool that provides this abstraction is object relational mapper (ORM)."

    This blog post gives a gist of what EF (Entity Framework 4) has to offer and how to program against it.

    Best place to start http://msdn.microsoft.com/en-us/data/aa937723.aspx 

    For beginners, I would recommend reading http://blogs.msdn.com/b/adonet/archive/2010/07/19/absolue-beginners-guide-to-entity-framework.aspx

    I have been going through the Entity Framework lately to implement it in our next project. To my surprise, programming against the database has become very simple with this framework. It helps the developer focus more on understanding the business domain and modelling the business entities than worrying about how to store and access the data.

    The framework provides ways to generate the DB directly from the modelled entities. This approach is called the Model-First approach and is usually recommended. However, the reverse, generating entities from an existing database, is also possible. This helps if you already have the DB ready and still want to leverage the framework.

    LINQ queries or lambda expressions are mostly used to perform CRUD operations against business entities instead of directly against the database.

    If you open the .edmx file (Entity Model file) in an XML editor then you will basically see the following sections.
        * Storage Model - Defines the Entities, Entity Sets and Associations. All information required to create the Database will be picked from here.
        * Conceptual Model - Defines the Entities, Entities Sets and Associations that will be consumed from the Business Layer. Information for modelling (Diagram) will be picked from here.
        * Mappings/Associations - Mappings between the Storage and the Conceptual Model is defined here.

    An EntitySet is a pluralized version of the Entity. A few base classes that you need to be aware of are
        * ObjectContext is the base class for the entity container. It is used as a container for entities.
        * EntityObject is the base class for the generated entity classes.

    By default, EF uses deferred (lazy) execution: a query is not sent to the database until its results are actually enumerated. There are also ways to force explicit execution of a query.
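    The deferred-execution behavior can be illustrated with plain LINQ to Objects (an in-memory analogy, not EF itself): the query is only a definition until something enumerates it, e.g. ToList():

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class Program
{
    static void Main()
    {
        var numbers = new List<int> { 1, 2, 3 };

        // The query is only defined here; nothing is evaluated yet.
        IEnumerable<int> evens = numbers.Where(n => n % 2 == 0);

        // An item added before enumeration is still picked up...
        numbers.Add(4);

        // ...because evaluation happens only when the query is enumerated.
        // In EF, this is the point where the SQL is sent to the database.
        List<int> result = evens.ToList();
        Console.WriteLine(string.Join(",", result));
    }
}
```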


    A few commands to query and update look like the following:
    ctx.Contacts.Where(c => c.SalesOrderHeaders.Any()).ToList();
    ctx.Customers.AddObject(customer);
    ctx.SalesOrderDetails.DeleteObject(order);
    ctx.SaveChanges(); // Persists the pending inserts/updates/deletes to the database.

    There are three ways of programming against the Entity Framework.
    • LINQ to Entities - Write LINQ queries to perform operations on Entities.
    • Entity SQL - Use SQL strings as commands. However you are writing commands against the Entities and not DB.
    • Query Builder - Use the methods provided with the Entity Framework instead of LINQ.

    There are times when you would like to do a complex set of operations on a varied set of tables while interacting with entities. This is when you could map entity operations to stored procedures. However, there are some limitations: for example, if one of the operations, say Insert, is mapped to a stored procedure, then the other operations also have to be mapped through stored procedures.

    Tracing the SQL commands
    During debugging you would like to see what DB query your LINQ operation is being converted to. The usual approach is to use SQL Profiler.

    This can be time consuming, switching between VS 2010 and SQL Server. Instead, you can use the System.Data.Objects.ObjectQuery.ToTraceString and System.Data.EntityClient.EntityCommand.ToTraceString methods, which enable you to view these store commands at runtime without having to run a trace against the data source.

    LINQ TO ENTITIES
    // Define an ObjectSet to use with the LINQ query.
    ObjectSet<Product> products = context.Products;
    // Define a LINQ query that returns a selected product.
    var result = from product in products
                 where product.ProductID == productID
                 select product;
    // Cast the inferred type var to an ObjectQuery
    // and then write the store commands for the query.
    Console.WriteLine(((ObjectQuery<Product>)result).ToTraceString());


    ENTITY SQL
    // Define the Entity SQL query string.
    string queryString =
        @"SELECT VALUE product FROM AdventureWorksEntities.Products AS product
          WHERE product.ProductID = @productID";
    // Define the object query with the query string.
    ObjectQuery<Product> productQuery =
        new ObjectQuery<Product>(queryString, context, MergeOption.AppendOnly);
    productQuery.Parameters.Add(new ObjectParameter("productID", productID));
    // Write the store commands for the query.
    Console.WriteLine(productQuery.ToTraceString());


    QUERY BUILDER
    int productID = 900;
    // Define the object query for the specific product.
    ObjectQuery<Product> productQuery =
        context.Products.Where("it.ProductID = @productID");
    productQuery.Parameters.Add(new ObjectParameter("productID", productID));
    // Write the store commands for the query.
    Console.WriteLine(productQuery.ToTraceString());


    You can retrieve objects by key using the GetObjectByKey and TryGetObjectByKey methods on ObjectContext. These return the object with the specified EntityKey into the object context. When you use GetObjectByKey, you must handle an ObjectNotFoundException; TryGetObjectByKey returns false instead.

    The abstraction EF provides does come with some performance overhead; the following links explore it:

    http://msdn.microsoft.com/en-us/library/cc853327.aspx
    http://blogs.msdn.com/b/adonet/archive/2008/02/11/exploring-the-performance-of-the-ado-net-entity-framework-part-2.aspx

    To improve performance, use compiled LINQ queries: precompiling your LINQ queries ensures faster repeated execution.
    http://msdn.microsoft.com/en-us/library/bb896297.aspx will provide more information on this.
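    CompiledQuery.Compile itself needs a live ObjectContext, but the underlying idea can be sketched with a plain expression tree (an analogy, not the EF API): pay the compilation cost once, then reuse the compiled delegate on every execution:

```csharp
using System;
using System.Linq.Expressions;

class Program
{
    static void Main()
    {
        // Build the query definition once as an expression tree...
        Expression<Func<int, bool>> isEven = n => n % 2 == 0;

        // ...compile it once (this is the expensive step)...
        Func<int, bool> compiled = isEven.Compile();

        // ...and reuse the compiled delegate for every subsequent call,
        // just as a compiled EF query reuses its translated SQL plan.
        Console.WriteLine(compiled(4));
        Console.WriteLine(compiled(5));
    }
}
```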


    By default, EF gets only an entity's data, without its related (associated) entities' data. Ex: There may be a case where you would want to retrieve SalesPersons along with their SalesOrder details.

    There are ways to tell EF to retrieve all the related entities' data so that you do not end up using a foreach loop to fill in the related data yourself, which would result in far too many DB calls.


    Using query paths, you can preload (eager-load) the related entities.

    // When n-level details have to be retrieved.
    var contacts = (from contact in context.Contacts.Include("SalesOrderHeaders.SalesOrderDetails")
                    select contact).FirstOrDefault();

    // When more than one related table has to be included.
    ObjectQuery<SalesOrderHeader> query =
        context.SalesOrderHeaders.Include("SalesOrderDetails").Include("Address");


    Security

    Can't end the article without mentioning security. I just evaluated the classic problem of SQL injection. Your old technique of parameterized queries is still valid in the Entity Framework.

    // SQL injection possible: user input is concatenated into the command text.
    context.ExecuteStoreQuery<Product>("select * from Products where pid = " + pid);
    // Guarded against SQL injection: the value is passed as a parameter.
    context.ExecuteStoreQuery<Product>("select * from Products where pid = @p0",
        new SqlParameter { ParameterName = "p0", Value = 1 });


    Overall, I am sure this is going to reduce the developer's work; however, it adds a little more discipline to ensure that developers do not just treat the entities plainly as a set of tables.

    That's all I was able to read and evaluate about Entity Framework so far. Hopefully I will add an advanced version of this article where I touch on transactions and concurrency, with a little more detail on the coding and more snippets. Till then, happy blogging!! :)

    Don't forget to start reading about http://msdn.microsoft.com/en-us/data/aa937723.aspx

    Saturday, September 11, 2010

    Windows Server AppFabric Caching Framework


    Windows Server AppFabric Caching, also known by its codename Velocity, is a framework that provides a unified, distributed cache. Application caching is nothing new and has been around for many years. It definitely saves those milliseconds, or even more, depending on the number of concurrent users fetching the data.

    Earlier, the cache used to be part of the web servers. Later, it moved into the application servers. However, the caching capability depended on the capacity of the application server. With more and more concurrent users, it became important to have much bigger caching servers. In order to prevent bottlenecks on these servers, a caching framework was needed that could provide all the capabilities of load-balanced web servers.




    The following are some of the functionalities that were expected and delivered by the Velocity framework:
    • Load Balance of the Cache Servers
    • Provision to scale dynamically without having to stop the applications
    • High availability in case a cache server goes down
    • Maintaining consistency in data copies stored across Cache Servers
    • Provide a mechanism to invalidate the Cache when the actual data store gets changed

    To Install & Configure the framework follow the link http://msdn.microsoft.com/en-us/library/ff383731.aspx 

    There are two places to store the configuration.
    • Network Shared folder - Usually smaller applications (1-5 Servers)
    • SQL Server - For larger Enterprises (greater than 5 Servers)
    Configuration can be done either programmatically or by using Windows PowerShell. The AppFabric/Velocity cmdlets are installed as part of the framework.

    The following are some of the terminologies that are used in Velocity Framework
    • Cache Server
    • Cache Host - Windows Service
    • Cache Client - Web Application Accessing the Cache 
    • Local Cache - Data is stored in memory of the Web Application
    • Server Cache - Data is serialized and saved in servers other than Web Application Server
    • Lead Host - Responsible for storing Cache and also to co-ordinate with other Hosts for managing the integrity of the other Cache Servers. 
    • Cache Cluster - A set of Cache hosts working together to provide a unified Cache view

    There are two ways of partitioning a cache. You can configure data to go into one of these partitions to manage the cache effectively for performance, and also to isolate the invalidation effect from other cache data.
    1. Named Cache
    2. Named Region


    Memory management
    There are two ways of invalidating the Cache.
    • Timeout
    • Notification (default polling interval of 300 seconds) - The client checks periodically for invalidation notifications.

    Periodically, invalidated cache entries get cleaned up for effective memory management. However, there may be cases where the framework chooses to evict cached data when memory is scarce. Your application may get exceptions if it is written expecting the data to always be in the cache; hence it is a must to write your application to handle such events.

    The eviction strategy used by the Velocity framework is LRU (Least Recently Used), i.e. the least recently used data gets evicted first.
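    As an illustration of the policy only (a hypothetical sketch, not the AppFabric API), a minimal LRU cache evicts from the least-recently-used end of an access-ordered list when capacity is exceeded:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical LRU cache sketch: most recently used entries live at the
// front of the linked list; eviction removes from the back.
class LruCache<TKey, TValue>
{
    private readonly int capacity;
    private readonly Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>> map =
        new Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>>();
    private readonly LinkedList<KeyValuePair<TKey, TValue>> order =
        new LinkedList<KeyValuePair<TKey, TValue>>();

    public LruCache(int capacity) { this.capacity = capacity; }

    public void Put(TKey key, TValue value)
    {
        if (map.TryGetValue(key, out var existing))
        {
            order.Remove(existing);
        }
        else if (map.Count >= capacity)
        {
            // Evict the least recently used entry (tail of the list).
            var lru = order.Last;
            order.RemoveLast();
            map.Remove(lru.Value.Key);
        }
        map[key] = order.AddFirst(new KeyValuePair<TKey, TValue>(key, value));
    }

    public bool TryGet(TKey key, out TValue value)
    {
        if (map.TryGetValue(key, out var node))
        {
            // Touch: move to the front so it becomes most recently used.
            order.Remove(node);
            order.AddFirst(node);
            value = node.Value.Value;
            return true;
        }
        value = default(TValue);
        return false;
    }
}

class Program
{
    static void Main()
    {
        var cache = new LruCache<string, int>(2);
        cache.Put("a", 1);
        cache.Put("b", 2);
        cache.TryGet("a", out _);   // "a" is now most recently used
        cache.Put("c", 3);          // evicts "b", the least recently used
        Console.WriteLine(cache.TryGet("b", out _));
        Console.WriteLine(cache.TryGet("a", out _));
    }
}
```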

    High availability is achieved by making copies of the Cache data. The number of copies that needs to be maintained can be configured.

      
    Security
    Security is handled at the transport level. The Velocity framework can be configured to allow only certain user contexts to access the cache servers. You should allow the user context of the web application's application pool to have access to the cache servers.

    Lastly, ASP.NET 4.0 applications can leverage the Velocity framework for storing session state.
    Hopefully, I have touched some concepts of the new caching framework. Can't wait to implement this on my next project.
    References:
    http://msdn.microsoft.com/en-us/library/ee790954.aspx

    Saturday, August 21, 2010

    Rapid Development of Web Pages

    Yesterday, I set out to develop a simple web application and targeted an hour for it. I used VS 2010 for development.

    The application will do the following:
    • List a set of users
    • List a set of books
    • Perform a simple search on the Users vs. Books borrowed (through IDs)
    • Allow users to borrow & return books. Entries to be made accordingly
    The output turned out to be something like this:


    The following three layers were considered:
    1. Database
    2. Data Access Layer
    3. UI/Presentation Layer
    For brevity, I chose to ignore the Business Layer and any Service Layer.

    Database Layer
    Here I created a DB called Library and added three tables with the following constraints. The image below is self-explanatory as to what each of these tables does.

    Data Access Layer
    This was going to be a class library. I could have chosen to merge this with the web application, but I thought adding some amount of modularity wouldn't hurt.
    Here I chose to add an ADO.Net Entity Data Model. This will give a data access view of the tables in your .Net application. Follow the steps below to create it:
    Add New Item on the DAL > Visual Studio C# Items > Data > ADO.Net Entity Data Model > Specify a valid Model name say, LibraryModel.edmx and go to the next page.
    Select Generate from Database and go to the next page.
    Select the database connection and leave the rest at the defaults. You may want to change the identifier for the connection string that will go into App.config/Web.config. Say, LibraryEntities.
    Select the Tables you would want to make it visible in the .Net application and specify the Model namespace say, LibraryModel and click on Finish.

    Your class library should look something like this:

    Now, I created a simple facade to wrap some of the DB operations. I called the class LibraryManager.cs.
    This class uses LINQ to Entities for faster development against the tables.

    The file looked something like this:

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Text;

    namespace DataAccess
    {
        public class LibraryManager
        {
            private LibraryEntities dbContext;

            public LibraryManager()
            {
                dbContext = new LibraryEntities();
            }

            public List<tblBook> GetBooksList()
            {
                var query = from p in dbContext.tblBooks
                            select p;
                return query.ToList();
            }

            public List<tblUser> GetUsersList()
            {
                var query = from p in dbContext.tblUsers
                            select p;
                return query.ToList();
            }

            public bool AddBook(tblBook book)
            {
                dbContext.AddTotblBooks(book);
                dbContext.SaveChanges();
               
                return true;
            }

            public bool AddUser(tblUser user)
            {
                dbContext.AddTotblUsers(user);
                dbContext.SaveChanges();

                return true;
            }

            public bool BorrowBook(int nBookId, int nUserId, DateTime date)
            {
                tblUserBook userBook = new tblUserBook();
                userBook.nBookId = nBookId;
                userBook.nUserId = nUserId;
                userBook.sBorrowedDate = date;
                dbContext.AddTotblUserBooks(userBook);
                dbContext.SaveChanges();

                return true;
            }

            public List<tblUserBook> SearchBorrowedItemsByBookAndUser(int nBookId, int nUserId)
            {
                var query = from p in dbContext.tblUserBooks
                            where (p.nBookId == nBookId && p.nUserId == nUserId)
                            select p;
                return query.ToList();
            }

            public bool ReturnBook(int nUserBookId, DateTime date)
            {
                tblUserBook userBook = dbContext.tblUserBooks.Single(p => p.nId == nUserBookId);
                userBook.sReturnedDate = date;
                dbContext.SaveChanges();

                return true;
            }
        }
    }
     

    UI/Presentation Layer
    Here I just created a simple web application and added a new item, "Web Form using Master Page", called Users.aspx, which is tied to the default master page.

    In order to access the Database through the Entity Framework, you need to have the connection string set in your web.config.
    Copy the connection string from the App.config of your Data Access Layer to your web.config

    Users.aspx had the following controls:
    • Grid views to show the lists of users and books, and the user-vs-books search results
    • Other simple controls to perform the borrow and return book features

    Important to highlight is how easy it is to bind the list returned from the DAL to the grid.

    Observe the ones highlighted in bold below.

    Users.aspx
    <%@ Page Title="" Language="C#" MasterPageFile="~/Site.Master" AutoEventWireup="true" CodeBehind="Users.aspx.cs" Inherits="Library.Users" %>
    <asp:Content ID="Content1" ContentPlaceHolderID="HeadContent" runat="server">
    </asp:Content>
    <asp:Content ID="Content2" ContentPlaceHolderID="MainContent" runat="server">
        <asp:GridView ID="gvUsers" runat="server" AutoGenerateColumns="false">
        <Columns>
           <asp:BoundField HeaderText="ID" DataField="nId" />
           <asp:BoundField HeaderText="Name" DataField="sName" />
        </Columns>
        </asp:GridView>

        <br />

        <asp:GridView ID="gvBooks" runat="server" AutoGenerateColumns="false">
        <Columns>
           <asp:BoundField HeaderText="ID" DataField="nId" />
           <asp:BoundField HeaderText="Name" DataField="sName" />
           <asp:BoundField HeaderText="ISBN" DataField="sISBN" />
           <asp:BoundField HeaderText="Is Active" DataField="bActive" />
        </Columns>
        </asp:GridView>

        <br />
        <asp:Label ID="lblBorrowMessage" runat="server" Text=""></asp:Label>
        <br />
        <table>
        <tr>
        <td><asp:Label ID="lblSearchBookId" runat="server" Text="BookId"></asp:Label></td>
        <td><asp:TextBox ID="txtSearchBookId" runat="server"></asp:TextBox></td>
        <td><asp:Label ID="lblSearchUserId" runat="server" Text="UserId"></asp:Label></td>
        <td><asp:TextBox ID="txtSearchUserId" runat="server"></asp:TextBox></td>
        <td><asp:Button ID="btnSearchBookUserId" runat="server" Text="Search"
                onclick="btnSearchBookUserId_Click" /></td>
        <td><asp:Button ID="btnBorrowBook" runat="server" Text="Borrow" onclick="btnBorrowBook_Click"
                 /></td>
        </tr>   
        </table>
        <br />
        <asp:GridView ID="gvBookUserId" runat="server" AutoGenerateColumns="false">
        <Columns>
           <asp:BoundField HeaderText="ID" DataField="nId" />
           <asp:BoundField HeaderText="Borrowed Date" DataField="sBorrowedDate" />
           <asp:BoundField HeaderText="Returned Date" DataField="sReturnedDate" />
        </Columns>
        </asp:GridView>

       
        <br />
        <asp:Label ID="lblReturnMessage" runat="server" Text=""></asp:Label>
        <br />
        <table>
        <tr>
        <td><asp:Label ID="lblReturnBook" runat="server" Text="BookUserId"></asp:Label></td>
        <td><asp:TextBox ID="txtBookUserId" runat="server"></asp:TextBox></td>
        <td><asp:Button ID="btnReturnBook" runat="server" Text="Return"
                onclick="btnReturnBook_Click"/></td>
        </tr>   
        </table>
        <br />


    </asp:Content>



    Users.aspx.cs
    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Web;
    using System.Web.UI;
    using System.Web.UI.WebControls;

    using DataAccess;

    namespace Library
    {
        public partial class Users : System.Web.UI.Page
        {
            private LibraryManager libraryMgr;

            protected void Page_Load(object sender, EventArgs e)
            {
                libraryMgr = new LibraryManager();
                gvUsers.DataSource = libraryMgr.GetUsersList();
                gvUsers.DataBind();
                gvBooks.DataSource = libraryMgr.GetBooksList();
                gvBooks.DataBind();

                lblBorrowMessage.ForeColor = System.Drawing.Color.Black;
                lblBorrowMessage.Text = "";
            }

            protected void btnSearchBookUserId_Click(object sender, EventArgs e)
            {
                int nBookId = Convert.ToInt32(txtSearchBookId.Text.Trim());
                int nUserId = Convert.ToInt32(txtSearchUserId.Text.Trim());
                gvBookUserId.DataSource = libraryMgr.SearchBorrowedItemsByBookAndUser(nBookId, nUserId);
                gvBookUserId.DataBind();
            }

            protected void btnBorrowBook_Click(object sender, EventArgs e)
            {
                int nBookId = Convert.ToInt32(txtSearchBookId.Text.Trim());
                int nUserId = Convert.ToInt32(txtSearchUserId.Text.Trim());
                if (libraryMgr.BorrowBook(nBookId, nUserId, DateTime.Now))
                {
                    lblBorrowMessage.Text = @"Data has been successfully recorded. Have fun reading.";
                    lblBorrowMessage.ForeColor = System.Drawing.Color.Green;
                }
                else
                {
                    lblBorrowMessage.Text = @"There was an error while trying to Borrow the Book.
                                            Contact the Administrator.";
                    lblBorrowMessage.ForeColor = System.Drawing.Color.Red;
                }
            }

            protected void btnReturnBook_Click(object sender, EventArgs e)
            {
                int nBookUserId = Convert.ToInt32(txtBookUserId.Text.Trim());
                if (libraryMgr.ReturnBook(nBookUserId, DateTime.Now))
                {
                    lblReturnMessage.Text = @"Data has been successfully recorded. Hope you had fun reading.";
                    lblReturnMessage.ForeColor = System.Drawing.Color.Green;
                }
                else
                {
                    lblReturnMessage.Text = @"There was an error while trying to Return the Book.
                                            Contact the Administrator.";
                    lblReturnMessage.ForeColor = System.Drawing.Color.Red;
                }
            }
        }
    }
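    One fragility in the click handlers above: Convert.ToInt32 throws a FormatException when a text box is empty or non-numeric. A hedged sketch of a safer parse (the InputParsing helper is hypothetical, not part of the original page):

```csharp
using System;

// Hypothetical helper for safely parsing the id text boxes instead of
// letting Convert.ToInt32 throw on empty or non-numeric input.
public static class InputParsing
{
    public static bool TryGetId(string text, out int id)
    {
        // Trim first, then require a positive integer id.
        return int.TryParse((text ?? string.Empty).Trim(), out id) && id > 0;
    }
}
```

    A click handler could then bail out with a friendly label message when TryGetId returns false, instead of surfacing a runtime error to the user.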

    To wrap up, the things that helped me achieve my one-hour target were:
    • Using Management Studio for SQL Server 2008
    • Creating an ADO.Net Entity Data Model
    • Using LINQ to do CRUD on the table data
    • Using Grids and Binding data
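    As a rough illustration of the LINQ style listed above, here is a minimal in-memory sketch of a query shaped like SearchBorrowedItemsByBookAndUser (the BookUser class here is hypothetical; the real code queried the ADO.Net Entity Data Model):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical in-memory stand-in for the borrow-records table.
public class BookUser
{
    public int BookUserId { get; set; }
    public int BookId { get; set; }
    public int UserId { get; set; }
    public DateTime? ReturnedOn { get; set; }
}

public static class LibraryQueries
{
    // LINQ query shaped like SearchBorrowedItemsByBookAndUser:
    // filter the records down to one book/user pair.
    public static List<BookUser> SearchBorrowedItems(
        IEnumerable<BookUser> records, int bookId, int userId)
    {
        return records
            .Where(r => r.BookId == bookId && r.UserId == userId)
            .ToList();
    }
}
```

    Against an Entity Data Model the same Where/ToList shape applies; the provider translates it to SQL.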
    Two links worth mentioning are:
    http://www.telerik.com/help/silverlight/consuming-data-linq-to-ado-net-entity-data-model.html
    http://www.mikepope.com/blog/DisplayBlog.aspx?permalink=1419&count=no 

    Well, in just a little more than an hour, I was able to develop what I set out for. So, mission accomplished :)

    Thursday, August 19, 2010

    Centralized Content Type Hub for publishing

    This article demonstrates the steps to create a central repository that stores all the content types in a single site collection and allows them to be used by all site collections (from any web application in the farm).

    This feature is new in SharePoint 2010. In MOSS, third-party components had to be used to achieve this kind of support.

    In order to achieve this, the steps below need to be followed:
    1. Configure the Managed Metadata Service Application (Properties). Type the site collection URL that will be used for publishing all the Content Types. Choose that site collection carefully, because the hub cannot be modified later; you would have to create another Managed Metadata Service Application to change the hub.

    Ex of URL is http://mtalt005:85/sites/hub/


    2. Configure the Connection of the Metadata Service Application (Properties). You can find this below the Managed Metadata Service Application. Enable the Site collections to be able to consume Content Types from the above hub.



    3. Enable the Site Collection Feature "Content Type Syndication Hub" of the Hub Site collection.



    4. Now create a Content Type in the hub site collection. Site Settings > Site Content Types > Create
    Say you create an AircraftImage content type whose parent Content Type is Image, from the Digital Asset Content Types group.

    5. Once the Content Type is created, you will be able to publish it for other site collections to consume. To publish, go to the content type, open the setting "Manage Publishing for this content type", and select OK. If it is already published, you will instead have options to unpublish (existing consumed content types will be de-linked and become local copies) or republish (in case of updates to the content type). All subscriptions are driven by timer jobs.

    The jobs below have to be started manually (through Run Now), or you can wait until their scheduled time.
    Central Admin > Monitoring > Review Job Definitions.
    Content Type Hub 
    Content Type Subscriber 

    Note: You may still need to wait until previously queued jobs have run; the jobs are processed as a queue, so have patience.
    If you don't see the jobs running, ensure that the SharePoint timer is running: check the status of the "SharePoint 2010 Timer" Windows service.
    This is a CPU-intensive service that I usually prefer to keep off on my dev box, so remember to turn it back on before testing this.

    6. Once the synchronization is done, you should be able to see the Content Types in any of the site collections (e.g. Team Sites) from any web application that has been configured for the subscription (i.e. for which the Content Type Subscriber job has run).
    To see the subscribed list of Content Types,
    Site Settings > Site Collection Administration > Content Type Publishing



    With this, you are now ready for centralized Content Type publishing. This will help you manage the updates for each of the content types.

    Tuesday, August 10, 2010

    Usability Enhancing features in SharePoint 2010


    Some of the usability features that I came across in SharePoint 2010:

    You can click on Edit (a Page ribbon button) and add content inline while viewing the page. Use the Format Text tab to help with formatting the font and color of the text.



    The Insert ribbon in edit mode gives a nice set of capabilities that enhance authoring.


    You can now:
    • Add an external link; clicking it navigates the browser to that link.
    • Point to an internal SharePoint list item. To do this, just type [[ in edit mode and follow the IntelliSense to the required list item. In view mode, clicking it shows the list item in an Ajax-enabled dialog, which keeps you within the context of the page while showing the content of the list item. You can also upload a document in a dialog and provide the link to it.
    • Upload an image from the local computer into one of the document libraries and have it rendered inline on the page.
    • Link an image from an external (Internet) source and render it inline on the page.


    The Designer ribbon gets enabled when working with images. You can do some basic digital manipulation on images, like setting the aspect ratio, borders and positioning on the page.


    These are some of the capabilities that have made rich content authoring in SharePoint 2010 easy.

    Friday, August 6, 2010

    Windows Workflow Foundation 3.5 Overview

    Any business process with a predefined set of tasks/activities is a good candidate for a workflow, especially one that takes time between tasks (like an approval process), waits on external events to be triggered (like a file drop), or responds to a new client request, say through a SharePoint list item being added.

    Without workflow, the process (application) had to keep its instance always running and waiting, which wasted a lot of CPU cycles. WF provides the capability to share data between the tasks of a workflow instance. The ability to persist state enables the application to be stopped while waiting on an event and invoked again when the event occurs. It is not required that all the activities in a particular instance of a workflow be executed by the same thread, or even the same process.

    A good use case would be ASP.NET pages: they can be modeled in a WF (e.g. a purchase workflow). This gives the capability to change the workflow dynamically without having to touch the pages.

    Advantages of WF Framework
    • Componentizing of code to execute a different chunk of software at each level
    • Tools to create and modify workflows graphically
    • Ability to monitor the status of the Workflow
    • Rich error handling and provision for rolling back an activity
    • Dynamically modifying the workflow by adding a task

    The fundamental components of WF are the following:
    • Activity: a unit of work. (Some Business Process Management products use the term 'task'.)
    • Workflow: A set of activities. Defines the flow of the activities based on business logic.
    • Workflow Designer: a graphical tool, typically hosted by Visual Studio that can be used to create and modify WF workflows and activities.
    • Base activity library: Standard out of the box activities provided by WF.
    • Runtime engine: a WF-supplied library that executes workflows. The runtime engine also provides other services, such as mechanisms for communicating with software outside the workflow. I.e. Assembly method, WCF Service etc.
    • Runtime services: a group of services that workflows can use, including support for transactions, persisting a workflow’s state, tracking its execution, and more.
    • Host process: a Windows application that hosts the WF runtime engine and any workflows it executes.
    There are two kinds of WF workflows:
    1. Sequential (has a start and an end)
    2. State Machine (jumps from one state to another based on actions/events)
    Check out MSDN for the list of activities in the base activity library.

    You can develop workflows using the Visual Studio designer or by writing code directly.

    You will use the following namespaces while coding.
    System.Workflow.Activities, System.Workflow.ComponentModel, and System.Workflow.Runtime

    A snippet to create a workflow would be as follows:

    using System.Workflow.Activities;

    public class ExampleWorkflow : SequentialWorkflowActivity
    {
    }

    Workflows can also be defined using the XML-based Extensible Application Markup Language (XAML):

    <SequentialWorkflowActivity x:Class="ExampleWorkflow"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/workflow"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" />

    A simple way to create an activity in C# is:
    using System.Workflow.ComponentModel;
    public class ExampleActivity : Activity
    {

    }
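    A custom activity usually overrides Execute to do its work. A minimal sketch against the WF 3.x API (the activity name and body are illustrative only):

```csharp
using System.Workflow.ComponentModel;

public class LoggingActivity : Activity
{
    protected override ActivityExecutionStatus Execute(ActivityExecutionContext executionContext)
    {
        // Do the unit of work here (call a service, write a log entry, etc.),
        // then report that the activity has finished.
        return ActivityExecutionStatus.Closed;
    }
}
```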
    A common hosting program would look something like this (WorkflowRuntime has no StartWorkflow method; you create an instance and start it):

    using System.Threading;
    using System.Workflow.Runtime;

    class ExampleHost
    {
        static void Main()
        {
            using (WorkflowRuntime runtime = new WorkflowRuntime())
            {
                AutoResetEvent waitHandle = new AutoResetEvent(false);
                runtime.WorkflowCompleted += (sender, e) => waitHandle.Set();

                runtime.StartRuntime();

                // CreateWorkflow returns an instance; Start schedules it for execution.
                WorkflowInstance instance = runtime.CreateWorkflow(typeof(ExampleWorkflow));
                instance.Start();

                waitHandle.WaitOne(); // keep the host alive until the workflow completes
            }
        }
    }

    WF ships with a set of services that allow the runtime to execute within ASP.NET applications.

    The ASP.NET-based host that ships with WF relies on SQL Server for persistence.

    The WF runtime acts as an intermediary for all communication with all workflows, even for unloaded workflows that are waiting on an input/event to execute an activity.

    To communicate with other objects in the same Windows process, a workflow can use two activities in the base activity library.

    1. CallExternalMethod activity: allows calling a method in an object outside the workflow
    2. HandleExternalEvent activity: allows receiving a call from an object outside the workflow.
    In order to communicate with WCF, there are two standard activities:
    1. Send: Sends a request using WCF, then optionally waits for a response.
    2. Receive: Receives an incoming request via WCF, then sends a response.

    WF ships with some built-in support for roles defined using Windows accounts, Active Directory, and ASP.NET, but other role management systems can also be used.

    Windows Workflow Foundation and Windows SharePoint Services:
    Using standard WF tools, such as the Workflow Designer, developers can create workflow applications that address document collaboration and other kinds of information sharing.

    Conclusion:
    A very generic framework for designing and developing business processes has been created in the form of Windows Workflow Foundation. This should help developers speed up development and also ensure better utilization of server CPU by unloading idle process instances that are waiting on external inputs.
    Using the framework, developers can also write tools that help business users track the status of each workflow instance for better management.

    Wednesday, August 4, 2010

    Claims based security model in SharePoint 2010

    In this article I will briefly touch on the claims-based security model support in SharePoint 2010.

     
    SharePoint comes with two kinds of authentication when you are creating a new Web Application.
    1. Classic Mode Authentication (default)
    2. Claims Based Authentication

     

     
    Classic Mode is just Windows-based authentication and is provided for backward compatibility.
    The new mode of authentication, i.e. Claims Based, works around the concept of an identity and is based on the standards WS-Federation and WS-Trust and protocols like SAML (Security Assertion Markup Language).
    It provides a generic way for applications to acquire identity information from users within and across organizations, and also on the Internet.
    Identity information is contained in a security token, often simply called a token. A token contains one or more claims (trusted pieces of information) about the user. This information stays with the user throughout the session.

     
    This is developed on Windows Identity Foundation (WIF). Features of claims-based identity include:
    • Authentication across users of Windows-based systems and systems that are not Windows-based.
    • Multiple authentication types.
    • Stronger real-time authentication.
    • A wider set of principal types.
    • Delegation of user identity between applications. (Can resolve Double Hop issues easily)

     
    Out of the box, SharePoint supports authentication using Windows and Forms (both also supported in MOSS) and LiveID/OpenID. However, integrating with custom authentication providers is possible as long as the application can trust the Issuing Authority of the security tokens.

     
    Definitions of some of the concepts that you need to be aware of are as follows:
    • Identity: security principal used to configure security policy
    • Claim: attribute of an identity (Login Name, AD Group, etc)
    • Issuer: trusted party that creates claims
    • Security Token: serialized set of claims, digitally signed by the issuing authority (a Windows security token or a SAML token)
    • Issuing Authority: issues security tokens knowing claims desired by target application
    • Security Token Service (STS): builds, signs and issues security tokens
    • Relying Party: application that makes authorization decisions based on claims
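    To make these definitions concrete, here is a rough sketch of a relying party inspecting the claims in the current security token. This assumes the WIF (Microsoft.IdentityModel) API and is illustrative only, not taken from SharePoint itself:

```csharp
using System;
using System.Threading;
using Microsoft.IdentityModel.Claims; // Windows Identity Foundation

public static class ClaimsInspector
{
    public static void DumpClaims()
    {
        // In a claims-aware application the current principal carries an
        // IClaimsIdentity whose Claims collection came from the issuer's token.
        IClaimsIdentity identity =
            Thread.CurrentPrincipal.Identity as IClaimsIdentity;
        if (identity == null)
            return; // not running under a claims-based identity

        foreach (Claim claim in identity.Claims)
        {
            Console.WriteLine("{0} = {1} (issuer: {2})",
                claim.ClaimType, claim.Value, claim.Issuer);
        }
    }
}
```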

     
    There are two cases of claims, incoming and outgoing. The scenarios differ in the way the claims get authenticated or validated; see the images in the MSDN article.
    Hope this gives you a start in understanding, at a very high level, the concepts of the claims-based security model in SharePoint 2010.

    Tuesday, August 3, 2010

    HTML5! The next generation of HTML

    Current browser applications have gone through a phenomenal change. The market demand for richer clients has pushed plugins like Silverlight, Flash, Java Applets etc. onto the browsers.

    However, HTML is still a base for most of the Web Applications (including SharePoint), when it comes to support for all browsers. In order to reduce the dependency of such third party plugins, it was necessary to add more controls into the HTML vocabulary.

    HTML5 is still a work in progress. However, most modern browsers have some HTML5 support. Sadly, it is not supported in IE8.

    One particular thing in HTML5 interested me: its capability to store data on the client.
    It offers two new methods for storing data on the client:
    • localStorage - stores data with no time limit
    • sessionStorage - stores data for one session

    Earlier, cookies were used to store data, but they stored only limited information to represent the user. In most cases just a session ID was stored in the cookie, while the actual data was stored in session state on the server. The reason for doing that was to reduce the overhead of passing the data between the client and server on every request.

    In HTML5, the data does not have to be passed with every server request, but is used only when required. It is possible to store large amounts of data without affecting the website's performance.

    The data is stored in different areas for different websites, and a website can only access data stored by it.

    I am sure this will open the doors for a lot of offline capabilities of the web sites.

    Monday, August 2, 2010

    Overriding SharePoint My Profile: Ask me About (Expertise) links

    The My Profile home page comes with an Ask Me About web part, which is nothing but a list of expertise topics each user has added to their User Profile to let others know about their expertise.
    When a user browsing the profile clicks on an expertise link, it opens in the Noteboard web part, which posts the question to the person who listed the expertise.
    If the Noteboard web part is disabled, it opens the question in the mail client available on the machine.

    There was a bug in this: it opened the mail client and also a blank browser window. On closer observation I found that clicking the expertise link was calling a JavaScript function, NavigateToNoteboard().
    This JavaScript was coming from the web part, whose page I was not able to find, nor did I want to change it, as it was a standard web part being used everywhere.

    This is where I figured out that you can override JavaScript: the last definition of a function in a page overrides the earlier ones. So a simple hack of overriding this function immediately after the web part control registration solved the problem.

    Here is the link that I used in my overridden JavaScript for opening the question in a mail client.

    http://www.webmasterworld.com/javascript/3290040.htm


    document.location.href = "mailto:" + toAddress + "?subject=" + subject;

    Hope this was useful. In the above, I was able to showcase:
    1. The bug that opens a blank web page when Noteboard web part is removed.
    2. Overriding the Javascript
    3. Opening a mail client through JavaScript

    Sunday, August 1, 2010

    Some Usability concerns on SharePoint 2010

    This article lists some of the UI concerns I faced during my work in the last 3 weeks. Hopefully Microsoft improves some of these in the next patch.

    1. The User Profile property view takes a long time to load: no idea what is going on here.
    2. User Profile AD sync with 35K user records takes a long time loading the top-level tree. Even expanding the tree goes through minutes of wait time, and there is no visual indicator during the wait.
    3. User Profile property association with BDC: here the default view leads the user to believe that the association is already made.
    4. The external content type picker doesn't show proper values. It just shows the source, i.e. the DB name for the SQL Server type. It gets very confusing when there are multiple content types which share the same DB but map to different tables.
    5. The view distorts the user id, especially after setting the admin credentials for the Secure Store Service. Ex: server\administrator becomes i:0#.w|server\administrator. This leaves the user wondering whether the correct values were given or not.
    6. Changing the master page on the Search site collection and trying to add a web part page did not provide a scroll bar to find the Add button. Luckily the tabs were working.
    7. The MySite manage keywords page is skewed.
    I am all excited to head back home from my SharePoint onsite assignment. If my next assignment is in SharePoint then I may add more stuff under this blog.

    Saturday, July 24, 2010

    Cannot Edit the Associated LOB through BCS Association

    There is no way (through SharePoint Designer) to provide edit capabilities on a BCS external content type when it is in an association (through a foreign key) with another one.

    Read on to see how I attempted it.

    Today I was able to work on the write-back capabilities of BCS. It is straightforward to create a BCS external content type on a SQL Server table.
    I just went ahead with the Create All Operations option on my data source.
    In the wizard, make sure to uncheck the Read-Only property for those columns which need to be updatable.
    Once you are done with the wizard, click Finish.

    So far, so good. But I wanted to see how BCS behaves when I have associated tables. I wanted two sets of data coming from different sources, say one from SQL Server and the other from a WCF service. For the testing, I just tried with another SQL Server table which had a 1:1 association with the first one.
    The first table is like tblUserExtended and the second one which had association was tblUser
    Both these tables were linked through the tblUser::nUserId and tblUserExtended::nUserId (Think of this as another source which has some extended user data, stored for another application that has to be aggregated)

    Follow the msdn article (http://msdn.microsoft.com/en-us/library/ff728816.aspx) to create the association between two BCS entities.

    The reason I did that was to have a single external list which would show columns from both the User and UserExtended tables and allow the SharePoint user to edit some columns of User and some columns of UserExtended.
    SharePoint showed only the columns of tblUser, while the foreign key tblUser::nUserId is represented internally as a tuple. So I could not edit the columns of UserExtended in that view, though the association to the UserExtended tuple could still be changed.

    SharePoint out of the box does not support editing the associated BCS entity (at least for external content types created using SharePoint Designer).

    Maybe there is a way to do it, but I could not find it. Let me know if you have a solution for this.

    Friday, July 23, 2010

    Programming using BCS (Business Data Connectivity) API

    All searches in Bing & Google for the BCS API lead to writing a .NET connector for BCS. But today I had to write a complex component which required me to write code against BCS to retrieve LOB data.
    You may ask why BCS when you already have connectivity to the source of BCS. The reason: I wanted BCS to work something like an entity model, used in the context of SharePoint 2010.

    Thanks to chaholl's article, I was able to do it easily. Here is the link:
    http://www.chaholl.com/archive/2010/03/15/creating-a-web-service-to-access-bcs-data-in-sharepoint.aspx
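    For reference, the server-side object model approach looks roughly like this. This is a hedged sketch from memory of the Microsoft.SharePoint.BusinessData / Microsoft.BusinessData APIs; the model namespace, entity and LobSystem instance names are placeholders, and chaholl's article remains the authoritative walkthrough:

```csharp
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;
using Microsoft.SharePoint.BusinessData.SharedService;
using Microsoft.BusinessData.MetadataModel;
using Microsoft.BusinessData.Runtime;

class BcsSample
{
    public static void ListRows()
    {
        // Resolve the BDC service and its metadata catalog for the current context.
        BdcService service = SPFarm.Local.Services.GetValue<BdcService>();
        IMetadataCatalog catalog =
            service.GetDatabaseBackedMetadataCatalog(SPServiceContext.Current);

        // "MyNamespace", "MyEntity" and "MyLobInstance" are placeholders.
        IEntity entity = catalog.GetEntity("MyNamespace", "MyEntity");
        ILobSystemInstance lobInstance =
            entity.GetLobSystem().GetLobSystemInstances()["MyLobInstance"];

        // Execute the default Finder and enumerate the rows.
        IFilterCollection filters = entity.GetDefaultFinderFilters();
        IEntityInstanceEnumerator rows = entity.FindFiltered(filters, lobInstance);
        while (rows.MoveNext())
        {
            IEntityInstance row = rows.Current;
            // row["FieldName"] returns the field value for this row
        }
    }
}
```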

    Today, I also downloaded the SharePoint 2010 SDK. It comes with a lot of samples and help for programming with SharePoint. It takes only about 200MB of your hard drive and is definitely worth that space if you are planning to do a lot of SharePoint coding.

    Something that I came across in the SDK articles that is worth the mention,

    Restrictions in BDC:
    BDC does not support the ICollection or IEnumerable interfaces to represent collections in data structures, nor the generic ICollection<T>, IEnumerable<T>, and IList<T> interfaces; all such collections must implement the non-generic IList.
    For a Finder method returning multiple items, the return value must implement either IEnumerable or IEnumerator (except for databases, where only IDataReader is supported). BDC does not support the generic versions of IEnumerable and IEnumerator.
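    To make the Finder restriction concrete, here is a framework-free sketch of the method shapes a .NET connector would expose. All names here are illustrative, not a real BDC model:

```csharp
using System.Collections;
using System.Collections.Generic;

// Hypothetical entity returned by a .NET connector.
public class UserEntity
{
    public int UserId { get; set; }
    public string Name { get; set; }
}

public static class UserEntityService
{
    private static readonly List<UserEntity> store = new List<UserEntity>
    {
        new UserEntity { UserId = 1, Name = "Alice" },
        new UserEntity { UserId = 2, Name = "Bob" }
    };

    // Finder: returns the non-generic IEnumerable, per the restriction that
    // BDC does not accept the generic IEnumerable<T> here.
    public static IEnumerable FindAll()
    {
        return store; // List<T> also implements the non-generic IEnumerable
    }

    // SpecificFinder: returns a single entity by its identifier.
    public static UserEntity FindById(int userId)
    {
        return store.Find(u => u.UserId == userId);
    }
}
```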