Wednesday, July 25, 2012

Setting Site Permissions during Site Provisioning


Setting custom permissions during site provisioning can be a confusing affair. With some investigation, I was able to get the concepts straight.
Below is the design we used for groups and permissions during site provisioning in one of our projects. I hope it helps you in your projects as well.

Note: All groups are created at the site collection level, so they are visible in all webs created under it.

Site Collection Feature Activation
1. Create custom role definitions (permission sets) {ViewEditRD, ViewAddEditRD} at the site collection. These are inherited by all webs underneath; don't confuse this with breaking permission inheritance at the web level, since role definitions remain available even after a web breaks inheritance.
2. Create an admin group (SiteAdmins) in the SiteGroups collection. This group will be used for admin activities on the sub-webs or lists whose inheritance gets broken. (A sketch of both steps follows below.)
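
A minimal sketch of both steps in a site-scoped feature receiver is below. The names (ViewEditRD, ViewAddEditRD, SiteAdmins) come from the design above; the permission masks are only examples, so adjust them to your needs.

using Microsoft.SharePoint;

public class PermissionSetupReceiver : SPFeatureReceiver
{
    public override void FeatureActivated(SPFeatureReceiverProperties properties)
    {
        SPSite site = (SPSite)properties.Feature.Parent;
        SPWeb rootWeb = site.RootWeb;

        // 1. Custom role definitions at the site collection (root web).
        //    The permission masks here are illustrative only.
        SPRoleDefinition viewEdit = new SPRoleDefinition();
        viewEdit.Name = "ViewEditRD";
        viewEdit.BasePermissions = SPBasePermissions.ViewListItems | SPBasePermissions.EditListItems;
        rootWeb.RoleDefinitions.Add(viewEdit);

        SPRoleDefinition viewAddEdit = new SPRoleDefinition();
        viewAddEdit.Name = "ViewAddEditRD";
        viewAddEdit.BasePermissions = viewEdit.BasePermissions | SPBasePermissions.AddListItems;
        rootWeb.RoleDefinitions.Add(viewAddEdit);

        // 2. Admin group for sub-webs and lists that will break inheritance later.
        rootWeb.SiteGroups.Add("SiteAdmins", rootWeb.CurrentUser, rootWeb.CurrentUser,
            "Admin activities on sub-webs and lists with broken inheritance");
        rootWeb.Update();
    }
}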

New Project Provisioning
1. Create the project web with permission inheritance broken.
2. Create the custom groups {PMGroup, TeamMembersGroup} for every web created.
3. Associate the custom groups with Read permissions at the site collection.
4. Associate the custom groups with the appropriate role definitions {ViewEditRD/ViewAddEditRD} at the web level. SPRoleDefinition defines a set of permissions on SharePoint objects.
5. Associate the SiteAdmins group with the administrator role at the web level.
6. Break permissions at the list level and apply the required role assignments based on the role definitions and groups. The SPRoleAssignment class binds a group and a role definition to a SharePoint object (a web, list, or document library). A sketch of the whole sequence follows.
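
Below is a hedged sketch of steps 1-6, assuming the group and role definition names above. The site URL, web name, and list name are placeholders, "Full Control" stands in for the administrator role, and TeamMembersGroup would follow the same pattern as PMGroup.

using System;
using Microsoft.SharePoint;

class ProjectProvisioner
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://server/sites/projects"))   // placeholder URL
        using (SPWeb web = site.RootWeb.Webs.Add("project1"))               // placeholder web name
        {
            // 1. Break permission inheritance; false = don't copy the parent's role assignments.
            web.BreakRoleInheritance(false);

            // 2. Custom group (created at the site collection level, like all SharePoint groups).
            web.SiteGroups.Add("PMGroup", web.CurrentUser, web.CurrentUser, "Project managers");
            SPGroup pmGroup = web.SiteGroups["PMGroup"];

            // 3. Read permission for the group at the site collection scope.
            SPRoleAssignment readAssignment = new SPRoleAssignment(pmGroup);
            readAssignment.RoleDefinitionBindings.Add(site.RootWeb.RoleDefinitions["Read"]);
            site.RootWeb.RoleAssignments.Add(readAssignment);

            // 4. Bind the group to a custom role definition at the web level.
            SPRoleAssignment pmAssignment = new SPRoleAssignment(pmGroup);
            pmAssignment.RoleDefinitionBindings.Add(web.RoleDefinitions["ViewAddEditRD"]);
            web.RoleAssignments.Add(pmAssignment);

            // 5. SiteAdmins gets the administrator role ("Full Control" out of the box).
            SPRoleAssignment adminAssignment = new SPRoleAssignment(web.SiteGroups["SiteAdmins"]);
            adminAssignment.RoleDefinitionBindings.Add(web.RoleDefinitions["Full Control"]);
            web.RoleAssignments.Add(adminAssignment);
            web.Update();

            // 6. Same pattern at the list level after breaking its inheritance.
            SPList list = web.Lists["Documents"];                           // placeholder list
            list.BreakRoleInheritance(false);
            SPRoleAssignment listAdmin = new SPRoleAssignment(web.SiteGroups["SiteAdmins"]);
            listAdmin.RoleDefinitionBindings.Add(web.RoleDefinitions["Full Control"]);
            list.RoleAssignments.Add(listAdmin);
        }
    }
}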


Monday, July 9, 2012

Working with SharePoint Online using NodeJS




SharePoint Online (SPO) data can be accessed using the server object model (a restricted API set, since it runs as a sandboxed solution), but only when the application is hosted within the SPO site collection. For the rest of the scenarios you will need to use the Client Side Object Model (CSOM).


There are three approaches to accessing SharePoint/SPO data from client applications:
CSOM (JavaScript, Silverlight, .NET Managed)
Web Services
REST interfaces



Accessing remote SPO data from hosted applications using Silverlight/JavaScript is not possible, as the browser rejects such cross-domain requests to prevent cross-site scripting (also known as XSS) attacks. The only way to access such data is to either use a desktop application or write server-side code on your web server using .NET Managed CSOM, Web Services, or the REST interfaces. Before you can access the data, you need to authenticate remotely to the SPO site.

The flow to authenticate remotely is:
1. Send the credentials, using SAML 1.1 over HTTPS, to the SPO STS endpoint https://login.microsoftonline.com/extSTS.srf. If the request is successful, the STS returns a token.
2. Pass this token to SPO and fetch two cookies (called FedAuth and rtFa).
3. Pass these two cookies on every subsequent request made to SPO.


Note: The FedAuth cookie is written with an HttpOnly flag. This instructs browsers not to let client-side scripts read the cookie, thereby helping prevent cross-site scripting (XSS) attacks.


To demonstrate with an example, I will use the REST interfaces and, for the sake of learning a new platform, I have chosen NodeJS (a platform built on Chrome's JavaScript runtime for easily building fast, scalable network/web applications) as the server technology.


Steps: 
1. Create an Office 365 free trial using http://www.microsoft.com/en-us/office365/free-office365-trial.aspx?WT.z_O365_ca=Try_free-office365-trial_header_en-us . Say your domain is mydomain.sharepoint.com.
2. Navigate to the newly created SPO site http://mydomain.sharepoint.com (http is for P plans, https is for E plans).
3. Create a custom list 'Contacts' and add a couple of list items.
4. Install NodeJS from http://nodejs.org/#download
5. Copy the contents of https://github.com/lstak/node-sharepoint.git into a folder.
6. Through a console, navigate to the folder and execute the command $ npm install sharepoint. If that doesn't work, try $ npm install sharepoint@0.0.5
7. Create a new file in the same directory, name it spo.js, and paste in the code snippet provided below.
8. Execute the command $ node spo.js
9. While the server is waiting for requests, make a request to http://127.0.0.1:1337/


Output: You will see 'Hello World' in the browser, but behind the scenes the server has made a request to your SPO team site and fetched the data. Verify this by checking the logged information on the server console.


var http = require('http');
var SP = require('./sharepoint');

http.createServer(function (req, res) {
    res.writeHead(200, {'Content-Type': 'text/plain'});

    // Use the domain name and site which you have access to.
    var spo = new SP.RestService("http://mydomain.sharepoint.com/teamsite/");
    spo.signin('prashanthkn@mydomain.onmicrosoft.com', 'password', function (err, data) {
        // Check for errors during login, e.g. invalid credentials, and handle accordingly.
        if (err) {
            console.log("Error found: ", err);
            return;
        }

        // Start to do authenticated requests here...
        var oList = spo.list('Contacts');

        oList.get(function (err, data) {
            if (err) {
                console.log("Error fetching list: ", err);
                return;
            }
            data.results.forEach(function (item) {
                console.log(item.Id);
                console.log(item.Title);
            });
        });
    });

    // Note: the response is ended before the asynchronous SPO calls complete,
    // so the fetched data only shows up on the server console.
    res.end('Hello World\n');
}).listen(1337, '127.0.0.1');

console.log('Server running at http://127.0.0.1:1337/');


Once you have the data, you can decide how you want to share it with your clients.

Go through the NPM (Node Package Manager) documentation at http://npmjs.org/doc/install.html
A good read about SPO authentication under the hood: http://www.wictorwilen.se/Post/How-to-do-active-authentication-to-Office-365-and-SharePoint-Online.aspx


Hopefully this gets you started on evaluating NodeJS and SharePoint Online integration.

Thursday, March 8, 2012

Some best practices of developing SharePoint 2010 applications



I came across some nice videos on best practices from Ted Pattison. Here is the gist of what he suggests.

Development
  • Use site templates instead of site definitions; site definitions are hell to upgrade from one version of SharePoint to another.
  • Deploy site templates as sandboxed solutions in the staging/test environment to check for bottlenecks, and depending on your needs deploy them in production as sandboxed or farm solutions.
  • Use feature stapling: the FeatureSiteTemplateAssociation element staples a feature to a site definition.
  • Version your features and anticipate feature upgrades.
  • Mark stapled features as hidden (use the Visual Studio Properties window) so that site admins don't see them.
  • Keep in mind the various cycles of activation and deactivation. Deactivation leaves list instances behind, and the next activation will break during deployment. Write appropriate code to delete, or back up and then delete, previous versions; a sketch follows below.
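
A minimal sketch of that cleanup in a feature receiver, assuming a web-scoped feature and a hypothetical list name:

using Microsoft.SharePoint;

public class ProvisioningReceiver : SPFeatureReceiver
{
    public override void FeatureDeactivating(SPFeatureReceiverProperties properties)
    {
        SPWeb web = (SPWeb)properties.Feature.Parent;

        // Remove the list instance this feature provisioned, so that the next
        // activation does not collide with the leftover instance.
        SPList list = web.Lists.TryGetList("ProjectTasks");   // hypothetical list name
        if (list != null)
        {
            // Back up / export the list first if its data matters.
            list.Delete();
        }
    }
}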


Testing and Deployment
  • Use the SPDisposeCheck utility sparingly.
  • Use the using() statement, especially with SPSite objects, to let the .NET framework take care of disposal. Do not dispose SPSite objects created by others, such as those received through event receivers (e.g. properties.Parent.Site).
  • Test your environment without any SDKs installed; a VM would be good for this. Keep your QA environment on at least two boxes: one for the web/app server and another for the DB server.
  • Test your QA machines in debug mode to catch any unknown errors.
  • If you are deploying farm solutions, prefer a script such as PowerShell to install them.
  • Updates to web.config need to be automated. For safe-control entries you can add data to the manifest file; for other entries use the SharePoint object model (a sketch follows this list). See these methods of updating web.config: http://msdn.microsoft.com/en-us/library/ms439965.aspx / http://msdn.microsoft.com/en-us/library/bb861909.aspx
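
Here is a hedged sketch of an automated web.config update through SPWebConfigModification; the web application URL, owner key, and appSettings entry are placeholders:

using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

class WebConfigUpdater
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://server"))        // placeholder URL
        {
            SPWebApplication webApp = site.WebApplication;

            SPWebConfigModification mod = new SPWebConfigModification(
                "add[@key='MySetting']",          // name: XPath of the node to ensure
                "configuration/appSettings");     // path: its parent node
            mod.Owner = "MyFeature";              // lets you find and remove your entries later
            mod.Sequence = 0;
            mod.Type = SPWebConfigModification.SPWebConfigModificationType.EnsureChildNode;
            mod.Value = "<add key='MySetting' value='42' />";

            webApp.WebConfigModifications.Add(mod);
            webApp.Update();
            // Pushes the change into web.config on every server in the farm.
            webApp.WebService.ApplyWebConfigModifications();
        }
    }
}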








Sunday, March 4, 2012

Extending the debug session in SharePoint 2010


Error Message
The Web server process that was being debugged has been terminated by Internet Information Services (IIS). This can be avoided by configuring Application Pool ping settings in IIS. See help for further details.


Resolution
By default, the IIS application pool waits 90 seconds for an application to respond before it closes the application. This process is known as "pinging" the application. To resolve this issue, you can either increase the wait time or disable application pinging entirely.


To access the IIS app pool settings

  1. Open IIS Manager.
  2. In the Connections pane, expand the SharePoint server node and click Application Pools.
  3. On the Application Pools page, select the SharePoint application pool (typically "SharePoint - 80") and then, in the Actions pane, click Advanced Settings.
  4. To increase the wait time before IIS times out, change the value of Ping Maximum Response Time (seconds) to a value larger than 90 seconds.
  5. To disable IIS pinging, set Ping Enabled to False. (Both settings can also be changed from code; see the sketch below.)
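
If you would rather script this than click through IIS Manager, the same two settings can be changed with the Microsoft.Web.Administration API. A minimal sketch, assuming the default "SharePoint - 80" pool name:

using System;
using Microsoft.Web.Administration;

class ExtendDebugSession
{
    static void Main()
    {
        using (ServerManager manager = new ServerManager())
        {
            ApplicationPool pool = manager.ApplicationPools["SharePoint - 80"]; // assumed pool name

            // Either raise the ping timeout well beyond the 90-second default...
            pool.ProcessModel.PingResponseMaximumTime = TimeSpan.FromMinutes(10);
            // ...or disable pinging entirely while you debug.
            pool.ProcessModel.PingingEnabled = false;

            manager.CommitChanges();
        }
    }
}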



For more information on troubleshooting in SharePoint 2010, see:
http://msdn.microsoft.com/en-us/library/ee231594.aspx

Friday, February 10, 2012

SharePoint 2010 Scrollbar is not showing up


There are cases where your SharePoint pages render without a vertical scroll bar, or with the scroll bar disabled. However, if you hold the left mouse button down and drag downwards, you can still see the rest of the page.
The chances of this happening are higher on non-IE browsers, and it mostly occurs because of a custom master page.


We tried the different fixes suggested in articles out on the internet; here is the simple hack that did the trick.


// Put this snippet in the footer control so that it runs on all pages (requires jQuery).
function adjustDivSize() {
    var winHeight = $(window).height();
    $('#s4-workspace').css("height", winHeight);
}
$(document).ready(adjustDivSize);
$(window).resize(adjustDivSize);

Thursday, February 9, 2012

Cache implementation for your Web & WCF resource


There are different ways of implementing caching:
1. At the server, where you store data in memory using the ASP.NET Cache / Application classes.
2. At the client (browser). This definitely gives more of an advantage in terms of performance.

This article describes the second type of caching in detail.


What resources can be cached?
Static resources like HTML, scripts (js), style sheets (css), and images (gif, jpg), etc.
Ex: http://www.domain.com/images/banner.jpg
Dynamic resources behind REST-based URLs: if these resources don't change their data very frequently, they can be good candidates for caching.
What do we intend to save by caching on the browser?
The load on the server to fetch the data and pass it across, and even the HTTP request itself.
Another way to reduce the number of requests to the server is the minification process, where several js/css files are merged into a single file. You can read my previous article to learn more about it.


Implementing Caching
This is as simple as adding an Expires header (when you know exactly when the resource should expire) in IIS: http://technet.microsoft.com/en-us/library/cc770661(WS.10).aspx
You can also use the max-age header when you are not sure how many days you need to cache the resource. The link above describes both; a code-based sketch follows.
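
For completeness, here is a hedged sketch of setting the same headers from ASP.NET code-behind instead of IIS; the 30-day lifetime is only an example:

using System;
using System.Web;
using System.Web.UI;

public partial class CachedPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Expires: an absolute date, for when you know exactly when to expire.
        Response.Cache.SetExpires(DateTime.UtcNow.AddDays(30));

        // Cache-Control: max-age, a lifetime relative to the time of the request.
        Response.Cache.SetMaxAge(TimeSpan.FromDays(30));

        // public = proxy servers may cache too; see the security warning below.
        Response.Cache.SetCacheability(HttpCacheability.Public);
    }
}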


Challenges
Finding the exact Expires/max-age value becomes hard, as we cannot predict our bug-fix cycles (hot fixes etc.), especially in the case of smaller projects.


Solution
Two common solutions, both of which manipulate the URL:
1. Add a query string to the calls to the resource. This requires a find-and-replace across all the locations where the resource is referenced. Ex: http://www.domain.com/app/images/banner.jpg?rev=2345 . It becomes a painful process to update this every time a resource changes, and there is a very high chance that we miss modifying a revision query string somewhere. There is also the minor drawback that some proxy servers don't cache URLs which contain a query string.

For a REST-based WCF service, add this piece of code before sending the response:
OutgoingWebResponseContext ctx = WebOperationContext.Current.OutgoingResponse;
ctx.Headers.Add(HttpResponseHeader.CacheControl, String.Format("public, max-age={0}", maxAge)); // maxAge is in seconds

2. The second, more robust solution, "fingerprinting", is to inject a hash into the URL. Ex: http://www.domain.com/CBUSTZqTMzNV8qFU/0.jpg
Here the part of the URL before the file name is the fingerprint, a hash generated from the contents of the resource being accessed. This requires a mechanism like the URL Rewrite module to rewrite the requests. Ex: search for the pattern (.*)/CBUST[A-F0-9]{32}(/.*) and rewrite it to {R:1}{R:2}. For more information on URL Rewrite, see http://www.iis.net/download/urlrewrite
In your aspx files, you reference your resources like <script type="text/javascript" src="<%# Utilities.CacheBusterUrl("/_layouts/scripts/jquery-1.4.3.min.js") %>"></script>

Add this piece of code in your Page_Load if it is a page (the <%# %> binding expression above is only evaluated when the page is data-bound):
if (!this.Page.IsPostBack)
{
    this.DataBind();
}

If it is a user/web control, then ensure that the page adding the user/web control has the above piece of code.

The fingerprinting method has its own limitation: resources referenced inside a .css file or rendered from a SharePoint library cannot be cached this way.

The implementation of the CacheBuster utility is as follows.

using System;
using System.Collections;
using System.IO;
using System.Security.Cryptography;
using System.Text;
using System.Web;

public static class Utilities
{
    private static Hashtable Md5Map = Hashtable.Synchronized(new Hashtable());

    public static string CacheBusterUrl(string sourceUrl)
    {
        try
        {
            string filesystemPath = HttpContext.Current.Server.MapPath(sourceUrl);
            string fileMD5;
            lock (Md5Map.SyncRoot)
            {
                if (Md5Map.ContainsKey(filesystemPath))
                {
                    fileMD5 = Md5Map[filesystemPath].ToString();
                }
                else
                {
                    fileMD5 = CreateHash(filesystemPath);
                    Md5Map.Add(filesystemPath, fileMD5);
                }
            }
            return BuildCacheBusterURL(sourceUrl, fileMD5);
        }
        catch (Exception)
        {
            // On any failure (e.g. file not found), fall back to the plain URL.
            return sourceUrl;
        }
    }

    // CreateHash and BuildCacheBusterURL were not shown in the original post; the
    // versions below are plausible completions matching the (.*)/CBUST[A-F0-9]{32}(/.*)
    // rewrite pattern described above.
    private static string CreateHash(string filesystemPath)
    {
        using (MD5 md5 = MD5.Create())
        using (FileStream stream = File.OpenRead(filesystemPath))
        {
            byte[] hash = md5.ComputeHash(stream);
            StringBuilder sb = new StringBuilder(hash.Length * 2);
            foreach (byte b in hash)
            {
                sb.Append(b.ToString("X2")); // uppercase hex, 32 characters total
            }
            return sb.ToString();
        }
    }

    private static string BuildCacheBusterURL(string sourceUrl, string fileMD5)
    {
        // Inject the fingerprint segment just before the file name.
        int lastSlash = sourceUrl.LastIndexOf('/');
        return sourceUrl.Substring(0, lastSlash) + "/CBUST" + fileMD5 + sourceUrl.Substring(lastSlash);
    }
}


Security Warning
Be aware that if your web application stores cookies, then caching resources at proxy servers can be a security threat. Make sure to use HTTPS whenever you decide to store cookies.

Saturday, January 14, 2012

Not able to debug SharePoint solution



There are cases when you want to debug your SharePoint solution, but in vain: the breakpoint never gets hit. The reason is that your symbols do not match the binaries they were built with. The following are some of the common things we try at these times.

  • Delete the entire bin & obj folders.
  • Ensure that you are in Debug mode.
  • Do a Clean & Rebuild of the solution.
  • Put a System.Diagnostics.Debug.Assert(false); statement in your code. At runtime the process will raise an assertion failure, and you can then attach to the w3wp process. Sometimes it is confusing which w3wp process to attach to, because there is always more than one.
In spite of all the above, I have sometimes seen the debugger still fail to attach.

What has worked consistently until now is forcing the symbols (PDBs) to be loaded from the bin directory:
Tools > Options > Debugging > Symbols > click the folder icon and specify the path to your bin/pdb directory.
If your build machine and server machines are different, you can use this to cache the symbol files on the server or debugging machine.


The above steps hold good for any .NET project. Hope this setting helps you the next time you debug.

Sunday, December 25, 2011

Minification of JS & CSS files using YuiCompressor

Minification not only reduces the size of each JS or CSS file to its bare minimum, thereby reducing the download size, but also stitches multiple JS or CSS files together into one JS and one CSS file, thereby reducing the number of HTTP requests made to the server.


We happened to maintain several JS and CSS files, named with a nomenclature like debug.01.file1.js / debug.01.file1.css.
One reason is that we needed multiple files for modularization, and we were also using several third-party plug-ins.
Another reason is that the order in which the files are processed matters in some cases; e.g. the jQuery Mobile scripts must be preceded by the jQuery script files.


The final consolidated files are named projectname.js / projectname.css.


We used YuiCompressor for minification of js and css files.


Steps to setup:

Download YUI Compressor and extract the zip to C:\, i.e. C:\yuicompressor-2.4.7


Set environment variables pointing to the Java and YUI Compressor binaries:
Start > My Computer > Properties > Advanced System Settings > Environment Variables

Append the below string to the PATH variable:
;C:\Program Files (x86)\Java\jre6\bin

type "$(ProjectDir)scripts\debug.*.js" | java -jar "C:\yuicompressor-2.4.7\build\yuicompressor-2.4.7.jar" --type js -o "$(ProjectDir)scripts\okig.mobile.js"
type "$(ProjectDir)styles\debug.*.css" | java -jar "C:\yuicompressor-2.4.7\build\yuicompressor-2.4.7.jar" --type css -o "$(ProjectDir)styles\okig.mobile.css"

Set the post-build event to run the following script. This ensures that minification is done only in release mode; versioning is done for both debug & release modes.
if $(ConfigurationName) == Release ( 
echo 'Minifying the JS and CSS'
type "$(ProjectDir)scripts\debug.*.js" | java -jar "C:\yuicompressor-2.4.7\build\yuicompressor-2.4.7.jar" --type js -o "$(ProjectDir)scripts\okig.mobile.js"
type "$(ProjectDir)styles\debug.*.css" | java -jar "C:\yuicompressor-2.4.7\build\yuicompressor-2.4.7.jar" --type css -o "$(ProjectDir)styles\okig.mobile.css"
)


Code analysis using SharpLinter


SharpLinter is a command-line tool to automate error-checking of JavaScript files. It produces output formatted for Visual Studio's output window, so clicking on a line will locate the file and line in the IDE.


This helps you correct common mistakes and wrong assumptions we make about JavaScript.


One example: JavaScript's loose equality (==) performs implicit type conversions, e.g. '' == false and 0 == false are both true, and null == undefined is true, so a comparison can take a branch you did not intend.
JSLint therefore recommends using === / !== instead of == / !=.


Error console shows output as below.
 (lint) Use '===' to compare with 'null'. at character 14
 (lint) Missing radix parameter. at character 14
 (lint) Use '!==' to compare with 'null'. at character 42
 (lint) Use '!==' to compare with 'null'. at character 14


I used the following options.
$(ProjectDir)\Assemblies\SharpLinter\SharpLinter.exe
-v -y  -rf "$(ProjectDir)scripts\*.js"


jslint.global.conf
/*jslint
browser: true,
sloppy: true,
nomen: true,
plusplus: true,
forin: true,
type: true,
windows: true,
laxbreak: true
*/


You can find more information at https://github.com/jamietr

Tuesday, October 11, 2011

Performance tuning rules for Web Sites

A must-read article on web site performance:
http://developer.yahoo.com/performance/rules.html


Friday, September 16, 2011

SEO Optimizations using Rewrite.

Came across this nice link from Scott Guthrie.

Let's assume that by default /pages/default.aspx is served whenever a user types the domain name in the URL. Below are different cases demonstrating that search engines index each of these URLs as a different resource, thereby diluting your search rankings.


4 Really Common SEO Problems Your Sites Might Have



SEO Problem #1: Default Document
http://mysite.com
http://mysite.com/pages/default.aspx



SEO Problem #2: Different URL Casings
http://mysite.com/Pages/Default.aspx
http://mysite.com/pages/default.aspx


SEO Problem #3: Trailing Slashes
http://mysite.com
http://mysite.com/


SEO Problem #4: Canonical Host Names
http://mysite.com/pages/default.aspx
http://www.mysite.com/pages/default.aspx

Scott explains in detail how we can increase search relevancy by fixing these issues, using the URL Rewrite module to either rewrite or redirect users to a standard naming; a code-based alternative is sketched below.
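
For illustration, here is a hedged sketch of fixing problem #4 in application code (Global.asax) as an alternative to the URL Rewrite module; treating the www host as canonical is an assumption:

using System;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        // Redirect non-canonical hosts (e.g. mysite.com) to the canonical www
        // host with a 301, so search engines index a single resource.
        Uri url = Request.Url;
        if (!url.Host.StartsWith("www.", StringComparison.OrdinalIgnoreCase))
        {
            UriBuilder canonical = new UriBuilder(url);
            canonical.Host = "www." + url.Host;
            Response.StatusCode = 301;                     // permanent redirect
            Response.RedirectLocation = canonical.Uri.AbsoluteUri;
            Response.End();
        }
    }
}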

Read on...



Thursday, June 16, 2011

8 Steps to Create & Configure MySite in SharePoint 2010

Following are the steps to create and configure My Sites for your SharePoint 2010 farm:

  1. Create a web application with Classic Mode authentication.
  2. Create a new site collection using the My Site Host template under the Enterprise tab.
  3. Configure the User Profile service application > Setup My Sites.
  4. Set the Preferred Search Center URL & personal site URL format.
  5. Add the managed path "personal" as a wildcard inclusion through Manage Web Applications > My Site > Managed Paths.
  6. Turn on Self-Service Site Creation through Manage Web Applications > My Site > Self-Service Site Creation.
  7. The personal site is created the first time a user browses to it.
  8. Have fun! :)

 

Tuesday, May 10, 2011

Production Setup of SharePoint 2010 and SSRS Load balanced with NLB

Production Setup:
  1. Two web front ends (network load balanced) - no Central Administration
  2. Two application servers (SharePoint, PowerPivot and SSRS servers, network load balanced) - both servers run Central Administration
  3. Clustered database (active-passive mode)
  4. SAN (later switched to NAS) for storage of terabytes of images; the product involves storing a lot of images.
All machines ran Windows Server 2008 R2 with, as you can imagine, very high memory and hard drive configurations.
SharePoint does not require you to load-balance the application servers; however, in this specific case the requirement was high availability of the application and provision for scaling later.

First, configure the DB in clustered mode. This was already done by the DBA, so I will not cover it here.
Next, configure the Report Server in scale-out mode over NLB.

To set up and configure SharePoint, we followed the steps mentioned in the article below, with a few exceptions (configuring Search was not required).

We made sure that all the service applications (Excel, PowerPivot & Secure Store Service) were provisioned on both application servers.

Make sure to run only the required services on the WFEs; you can turn off all the service applications that are provisioned on the application servers.
Next, set up PowerPivot on both application servers and re-configure the PowerPivot service application from Central Administration.
http://msdn.microsoft.com/en-us/library/ee210616.aspx

Once the configuration was done, most of the remaining steps were custom.

A few things worth mentioning:
  1. Ensure that your domain name and web site name are not the same. We had a very hard time reverting this, as users internal to the company were not able to use the web site; all requests to the site were going to the domain controller instead.
  2. When you are creating a web application, use the web site name for the public URL, e.g. http://mywebsite.domain.com, instead of the application server name (filled in by default). We could not find a way to revert this.
  3. Make the appropriate DNS entries so that the web site is really internet facing. Your administrator may want to make entries in the name server to map the web site name to the local load-balanced web server, so that users (employees) within the domain need not go through the internet.
Hope this is a good start. Good luck with your setup!