The SaaS Experience: Application, Knowledge, Best Practices


Our customer success team recently released our Customer Central portal, which provides a wealth of material to help customers succeed. The project has been under way for some time, but its release got me thinking about the future of the SaaS application experience.

Much of the discussion around the user experience that a software vendor provides focuses on the technology itself. The technology is certainly a key aspect of the experience a user has with a software application, but it is far from the only aspect of that experience.


There are three main areas of the overall experience a user has with software:



The Software Application: the product itself, and the aspects of the technology and user interface that guide a user to better understand and use it


Help and Documentation: the explicit knowledge that surrounds the application and provides instructions on its use. Whether in a help center, support portal, or knowledge base, this documentation is generally application-specific and task-focused.


Community and Practice Information: the implicit knowledge of how an application can best be used to achieve a business goal. Often this information is shared by peers who have gained experience using the application to tackle a business objective, and it covers a broader scope, including the business, process, and political issues likely to be encountered.


Why bring this up?


Because I suspect we are at an interesting inflection point in the market after which these three experiences will no longer be separate.

Currently, these three main areas of experience with an application are separate. Software vendors provide the application experience, and while they may use a variety of third-party tools to do so, it is delivered as a relatively self-contained experience.


The documentation for a software application is, similarly, delivered as a separate experience. Very often, a third-party documentation platform is used that stores and presents information and allows users to search for it in a variety of ways.


Community and best-practice information is delivered in a much less standard way. Sometimes it comes through a single portal that hosts a vibrant community, but often it is an amalgamation of a variety of sources, some company-sponsored and some user-driven, where best-practice approaches are shared.


There are a few interesting trends in each of these areas that make me think that within a few short years we will be dealing with a single user experience across all three.

First, SaaS technologies, unlike installed software, easily allow integration of the application experience and the help documentation. As both the application and the documentation libraries are now built to be deployed in the cloud, it is a simple exercise in software engineering to embed the help documentation directly into the application experience.

Today's documentation systems are rapidly evolving to better support this approach, as it allows contextual access to key information and thus greatly improves the user's experience with the application.

Second, community software and help documentation software are evolving toward one another. Idea exchanges, discussion forums, and collaborative voting on enhancements all allow the once-passive readers of documentation to interact with one another and exchange ideas.

Third, applications themselves are beginning to enable the sharing of best-practice approaches. In demand generation and B2B marketing, the best automation programs for event management, free-trial optimization, data management, lead routing, or lead scoring are being shared among marketers within, and even between, organizations.

As these transitions take place, we are evolving towards a user experience that combines all three, for the benefit of the marketer. As an example, a marketer looking to build a lead nurturing campaign should be able to, from a single experience:

  • compare notes with other marketers who have tackled similar business challenges
  • download sample nurturing campaigns as potential starting points
  • understand the precise functionality of a particular software feature being used
  • build and deploy their own lead nurturing campaign

Today's marketers, when evaluating software offerings, are increasingly pushing the evaluation of demand generation software beyond the software itself. In its early days, SaaS shifted part of the burden of achieving success back toward the software vendor through a pricing model that was recurring rather than upfront. That has been a very healthy shift for the industry and for clients. Now, as marketers increasingly push to evaluate how each vendor will provide an overall, holistic experience that allows them to achieve success, we will see this shift continue.

Software continually evolves, and at each evolutionary step, disparate parts of the software stack become seamlessly integrated for the benefit of the user. I suspect that one of the next evolutions we are about to see is the integration of the application, the documentation, and the community into a single, seamless experience for the user. I, for one, am very much looking forward to it.

I look forward to your comments. Is this a shift you can see happening? Are you beginning to evaluate software in general, and demand generation software specifically, on its overall experience and path to success?


Read XML from a web page and bind the data to a Silverlight module.

Sometimes we have a requirement in SharePoint to pull images from a SharePoint library and display them in a Silverlight module with a rich UI. But because Silverlight runs on the client side, we can't reference the SharePoint libraries from the Silverlight project, and there are permission issues as well.
So the architecture we build here is this: SharePoint builds the XML for us and writes it onto the page, and Silverlight grabs the XML from the page and displays the images in the UI. How nice it is. You can get the first part, i.e. getting the XML (image) data from the SharePoint libraries, here.

Here, we discuss the code we need to place in the Page.xaml.cs file to read the XML from the web page.
public Page()
{
    // The id of the HTML element that holds the XML. You could also pass this
    // in through the Silverlight initParams instead of hard-coding it.
    string controlid = "divImgs";
    InitializeComponent();

    string xmlstring = string.Empty;
    if (controlid != null)
    {
        // Grab the element from the hosting HTML page and read its innerHTML,
        // which contains the XML written out by the data view web part.
        // Requires: using System.Windows.Browser;
        HtmlElement ctl = HtmlPage.Document.GetElementById(controlid);
        if (ctl != null)
            xmlstring = (string)ctl.GetProperty("innerHTML");
    }

    if (!string.IsNullOrEmpty(xmlstring))
    {
        ProcessXML(xmlstring);
    }
}
The above code is the Page constructor in the Page.xaml.cs file.
1. Here we grab the control on the HTML document and read its HTML. If you remember, while converting the data view web part output from HTML to XML, in the XSLT I assigned the id "divImgs" to an HTML tag. We use that id here in the .cs file and read the element's HTML, which gives us the XML data in the xmlstring variable.
2. Now we need to process the XML and bind the data to the Silverlight controls. This is why I am calling a function named ProcessXML.
// Requires: using System.IO; using System.Xml;
private void ProcessXML(string xml)
{
    // 'images' is a class-level List<string> field that collects the image URLs.
    images = new List<string>();
    if (xml != string.Empty)
    {
        try
        {
            StringReader textStream = new StringReader(xml);

            using (XmlReader reader = XmlReader.Create(textStream))
            {
                while (!reader.EOF)
                {
                    // The root <slides> element carries the base URL of the library.
                    if (reader.IsStartElement() && reader.LocalName == "slides")
                    {
                        if (reader.HasAttributes)
                        {
                            reader.MoveToAttribute("baseUrl");
                        }
                    }
                    else
                    {
                        // Each <slide> element carries one image URL.
                        if (reader.LocalName == "slide" && reader.HasAttributes)
                        {
                            reader.MoveToAttribute("imageUrl");
                            string imageName = reader.Value.ToLower();
                            // Keep only the formats Silverlight can render.
                            if (imageName.Contains(".jpg") || imageName.Contains(".png"))
                                images.Add(reader.Value);
                        }
                    }
                    reader.Read();
                }
            }
        }
        catch (Exception)
        {
            // Malformed XML is ignored; 'images' simply stays empty.
        }
    }
}
3. In the above code, images is a class-level field of type List<string>. We fill it with the image URLs while reading the XML string.
4. After calling the ProcessXML function, we are done collecting the image URLs. We now have a collection of image URLs we can feed to the Silverlight controls and display in the UI.
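For that display step, here is a minimal sketch, assuming Page.xaml contains a StackPanel named imagePanel; the panel name and sizing are illustrative, not from the original post:

// Minimal sketch: turn each collected URL into an Image control.
// Requires: using System.Windows.Controls; using System.Windows.Media.Imaging;
private void DisplayImages()
{
    foreach (string url in images)
    {
        Image img = new Image();
        // BitmapImage downloads the image from the given URI on the client.
        img.Source = new BitmapImage(new Uri(url, UriKind.RelativeOrAbsolute));
        img.Width = 200; // illustrative sizing
        imagePanel.Children.Add(img);
    }
}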

Very simple and nice.
Happy coding!!!

Show images in Silverlight by reading them from a SharePoint library

Hi,

Silverlight works on the client side, so it's not possible to reference SharePoint libraries in a Silverlight project. But there is a way we can access SharePoint data and show it in Silverlight on a SharePoint site.

We have a requirement to pull data from SharePoint libraries and lists and show it in Silverlight. Here I will explain a scenario where we pull the images from an image library and show them in Silverlight.



Follow the steps below to do this.

Get data from the SharePoint library and shape it into XML:

  1. Go to the page where you want to show the Silverlight module on the site, or create a new page.
  2. Open the page in SharePoint Designer.
  3. Add a data view web part to the page, somewhere in a web part zone.
  4. Select the data source library from the Task Pane menu.
  5. In the data source library, select the library/list that contains your images.
  6. Click on the library and select Show Data.

  7. From the Data Source Details task pane, select the columns you want to show in Silverlight, such as Title and Image Path.
  8. Choose Insert Selected Fields as "Multiple Item View". Now our data view web part is filled with the data. [If you want, apply some properties here, such as showing the images in ascending order.]
  9. After selecting the columns, apply a filter to get only images from the library/list. We can do this by selecting the Filter option on the data view web part.
  10. Here, select the image title or URL column as the field name, select Contains as the comparison, and enter the file formats Silverlight allows in the value field.

  11. Save the page and open it in the browser [IE].
  12. So far we have pulled the data from the SharePoint library/list and shown it in the data view web part, which renders as a table.
  13. Now we need to change the XSLT of the data view web part to turn the table layout into our XML format. Since the rendering is driven by XSLT, we can easily do that.
  14. Add the XSLT code below to the data view web part: edit the web part, then Modify Settings --> XSL editor.
  15. Remove the text in the editor and add this code.
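As a hedged sketch of the shape that XSLT can take, reconstructed from the XML that part 2's reader expects (a slides root with a baseUrl attribute and one slide element with an imageUrl attribute per row) — the @Title and @ImagePath column names are placeholders for your own data source columns:

<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/">
    <!-- "divImgs" is the id the Silverlight code looks up on the page. -->
    <div id="divImgs">
      <slides baseUrl="http://yourserver/sites/yoursite/ImageLibrary">
        <!-- Data view web parts receive their rows as /dsQueryResponse/Rows/Row. -->
        <xsl:for-each select="/dsQueryResponse/Rows/Row">
          <slide imageUrl="{@ImagePath}" title="{@Title}" />
        </xsl:for-each>
      </slides>
    </div>
  </xsl:template>
</xsl:stylesheet>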


  16. If you observe, I am pulling some values from the data view and building the XML.

  17. Now, when you save the web part and view the page, you won't see anything in the data view web part, because the data is now in XML format.
  18. You can verify this by viewing the page source and searching for the "slides" element.
  19. We are done with part 1, i.e. getting the data from the SharePoint library. Now we move on to the second part.
Get the XML from the page and give it as input to the Silverlight module:

At this point we have the XML on the page. Now we need to read it from the page in the Silverlight project and display the images however we want, applying animations, styles, etc.

You can see part 2 here.






Flyers: Renewal Marketing Leads to Deeper Interest Profiling

In the case study on the 76ers, we'd talked about using a multi-channel campaign to build upon the emotional excitement of an NBA franchise. In doing so, however, the opportunity to build a much deeper understanding of exactly what each fan is most passionate about was not overlooked. In this case study, from Digital Body Language, the Flyers leveraged the digital body language insights to better understand which player each fan was most enthusiastic about.

This insight will be leveraged to deepen the emotional tie in future marketing campaigns by focusing communications on what each individual fan is most interested in. I hope you enjoy the read:

Flyers: Renewal Marketing Leads to Deeper Interest Profiling

The Philadelphia Flyers wanted deeper relationships with fans while also driving the highest possible rates of renewal for season-ticket holders. With careful planning, they were able to achieve both of these goals at once.

The team created personal URLs (PURLs) for each season ticket holder (such as http://www.myflyerstickets.com/johnsmith) and invited each customer to his/her personal site to complete the renewal process. On the personal page, personalized content and offers enticed the ticket holder to renew. But just as importantly, the Flyers began to build the basis for a direct online relationship with each fan.

The Flyers’s site contains rich information (including video) on players, stats, schedules, and the draft, and through the direct relationship with each season ticket holder that they have now built, the Flyers better understand each fan. By observing each customer’s unique digital body language as they look at stats, read up on players, and watch highlights, the Flyers can identify things such as favorite players and whether they prefer stats or highlight reel footage.
In upcoming seasons, the Flyers plan to leverage this rich base of knowledge about fans’ digital body language to continually strengthen and hone the message. Personalized video and audio messages from each fan’s favorite player, and tailored RSS feeds of stats and highlights, will deepen the bond with the team.

The Flyers increased online season-ticket renewals from 1% the previous year to 18%. Renewing online also allowed real-time processing, so these numbers were available immediately to senior management, as opposed to the time lag that occurs when renewals are processed manually. Of course, the Flyers were also able to deepen their understanding of their fan base and strengthen those relationships significantly.

Replace a querystring value in JavaScript

In my project I have a requirement to take the URL from the browser and, depending on the user's selection or some other criteria, change some querystring values and reload the page with the new URL. Here is a small JavaScript function that does exactly that.
function replaceQueryString(url, param, value) {
    // Replaces the value of 'param' in 'url', or appends it if absent.
    var preURL = "";
    var postURL = "";
    var newURL = "";

    // Note: indexOf(param + "=") will also match a longer parameter name that
    // ends with 'param' (e.g. "pid=" matches "id="); use with care.
    var start = url.indexOf(param + "=");
    if (start > -1) {
        // Keep everything up to the "=", then append the new value.
        var end = url.indexOf("=", start);
        preURL = url.substring(0, end) + "=" + value;

        // Keep any parameters that follow the one we replaced.
        postURL = "";
        var startRest = url.indexOf("&", start);
        if (startRest > -1) {
            postURL = url.substring(startRest);
        }
    } else {
        // Parameter not present: append it with "?" or "&" as appropriate.
        var delimiter = "";
        preURL = url;
        if (url.indexOf("?") > 0)
            delimiter = "&";
        else
            delimiter = "?";

        postURL = delimiter + param + "=" + value;
    }
    newURL = preURL + postURL;

    // Project-specific step: strip any "id=..." parameter from the result.
    var index = newURL.indexOf("id=", 0);
    if (index > -1) {
        var Nurl = newURL.substring(0, index);
        var EUrl = newURL.substr(index, newURL.length - index);
        var eIndex = EUrl.indexOf("&", 0);
        if (eIndex > -1)
            EUrl = EUrl.substr(eIndex, EUrl.length - eIndex);
        newURL = Nurl + EUrl;
    }
    return newURL;
}
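A quick usage example; the URL and parameter names are made up for illustration:

// Replace the value of "lang" in an existing URL:
var updated = replaceQueryString("http://example.com/page.aspx?lang=en&view=full", "lang", "fr");
// updated is "http://example.com/page.aspx?lang=fr&view=full"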
newURL holds the rebuilt URL with the updated querystring value.

Mozilla Firefox Enter key problem in ASP.NET.

I was facing a small problem, which ate up almost 10 hours of my time.
Introduction:
I am using the UpdatePanel [Ajax] on my site. According to my client's specification, whichever panel or div the cursor focus is in, pressing Enter should fire that panel's button click event, send the request to the server, and return the response.
I am using the Panel control on my pages because it has a property called DefaultButton. This works perfectly, if and only if the user browses my site in IE. So what about Mozilla?
Here my problem starts…
Problem:
1. The DefaultButton property of the Panel/Form won't respond to the Enter key in Mozilla when the panels reside inside an UpdatePanel control.
2. When anyone presses Enter, it should first run validation against any input controls in that panel.
Solution:
After long research I wrote a small JavaScript function to solve the problem.
OnClientClick="setDefaultButton(this.name);"
JavaScript function:
function setDefaultButton(name)
{
    // Run the ASP.NET client-side validators first.
    Page_ClientValidate();
    if (document.all)
    {
        // IE: the Panel's DefaultButton already handles the Enter key.
    }
    else
    {
        // Other browsers: trigger the postback ourselves, but only
        // when the page passed validation.
        if (Page_IsValid)
            __doPostBack(name, "");
    }
}
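For context, here is a hedged sketch of the markup this function assumes; the Panel, TextBox, and Button IDs are hypothetical:

<asp:Panel ID="pnlSearch" runat="server" DefaultButton="btnSearch">
    <asp:TextBox ID="txtQuery" runat="server" />
    <asp:Button ID="btnSearch" runat="server" Text="Search"
        OnClick="btnSearch_Click"
        OnClientClick="setDefaultButton(this.name);" />
</asp:Panel>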
If you have any issues with my code, please let me know.

Role of ItemMetadata.xml in InfoPath forms [SharePoint workflow].

Introduction:

ItemMetadata.xml is the file used to store and transfer data from one InfoPath form to another. The name of the file is case-sensitive.

For example, suppose a SharePoint workflow has 10 steps, each step uses an InfoPath form, and we want to send some information from one InfoPath form to the next. That information isn't saved anywhere by default; we have to tell InfoPath explicitly to store it in an XML file. That file is ItemMetadata.xml.

It acts as a secondary data source for the InfoPath form.

To compensate for possible schema differences between the task data and the form, it stores only the schema, not the data.

An example schema is:

<z:row xmlns:z="#RowsetSchema" ows_fieldname=""/>

Note: every field name should be prefixed with ows_.
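For instance, a task form passing along a status and a comment might use an ItemMetadata.xml like this (the field names here are hypothetical):

<z:row xmlns:z="#RowsetSchema" ows_Status="" ows_Comments="" />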

In code, we can retrieve the data from the InfoPath form using a property called ExtendedProperties.

afterProperties.ExtendedProperties["fieldName"].ToString();

Conditional Statement example in SharePoint workflow.

In SharePoint workflows, when we use the IfElse activity, the if/else condition methods receive an event argument called ConditionalEventArgs.
If you set e.Result = true, the if branch executes;
if you set e.Result = false, the else branch executes.

private void IfMethod(object sender, ConditionalEventArgs e)
{
    if (isTrue)
        e.Result = true;
    else
        e.Result = false;
}

private void ElseMethod(object sender, ConditionalEventArgs e)
{
    if (!isTrue)
        e.Result = true;
    else
        e.Result = false;
}

Depending on the boolean variable "isTrue", the if or else branch executes.
In the case above, I set "isTrue" according to my logic before the if/else condition is evaluated.
The same logic applies to the While activity:
if you set e.Result = true, it keeps iterating until you set e.Result = false.

To access an HTML control without runat="server" in C# code.

In ASP.NET coding, I don't think everything needs to be created or declared as an ASP.NET server-side control. In some cases, to make our page more efficient and faster, we can write plain HTML controls, adding runat="server" only when we need to access them in server-side code [C#].

But there are special requirements where we need to create HTML controls dynamically in C#, build them into a string, and write that string to the page. Then, whenever some event is raised, such as a button click, we need to retrieve the values of those HTML controls in server-side code. Because they are not declared with runat="server" on the page, we can't read their values the usual way. The following solution works for that type of scenario; it shows how to get the values of HTML controls that don't have the runat="server" attribute.

Example:
HTML declaration:
<input type="text" name="txtName" />

C# Code:
// The key is the value of the control's "name" attribute, "txtName" in the example above.
string strValue = Page.Request.Form["txtName"];

Note:
To read the value of an HTML control in server-side code, the following points must hold:
  • The tag must have a NAME attribute, because it is used as the key in Request.Form[].
  • The form method must be POST.
That's it!!! Please let me know, if you have any issues with this approach. Hope this helps...
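Putting the pieces together, a minimal sketch of the round trip might look like this; the Literal and Button controls are hypothetical stand-ins:

// Markup assumed on the page (hypothetical):
//   <asp:Literal ID="litControls" runat="server" />
//   <asp:Button ID="btnSave" runat="server" Text="Save" OnClick="btnSave_Click" />

protected void Page_Load(object sender, EventArgs e)
{
    if (!IsPostBack)
    {
        // Render a plain HTML input with no runat="server";
        // the name attribute is what Request.Form keys on.
        litControls.Text = "<input type=\"text\" name=\"txtName\" />";
    }
}

protected void btnSave_Click(object sender, EventArgs e)
{
    // Read the posted value on the server; null if the field was not posted.
    string strValue = Request.Form["txtName"];
}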

How to give anonymous access to reports.

Give anonymous access to the Application.

1. Go to IIS.

2. Select reports application –> properties –> Directory Security –> Authentication and Access control –> Edit –> Enable Anonymous access

Give anonymous access to a report.

1. Open the reports on the report server located at http://<servername>/reports

2. Browse to the report that needs to be accessed anonymously.

3. Select properties tab for the report.

4. Select security from the left navigation.

5. Select New Role Assignment from this page.

a. Enter the group name Everyone (in the "Group or user name" text box).

b. Select the Browser role and then click OK.

[The Browser role grants READ-ONLY access.]

Creating a new instance of a SharePoint workflow through C# code.

One SharePoint 2007 feature I like very much is that the framework supports writing custom events, features, workflows, and other components; we can customize the complete SharePoint system in almost any context.
While working with SharePoint custom workflows, we hit scenarios where we need to stop a workflow or start a new workflow instance through code. So I did some research in the MSDN library, reflected over the DLLs, and found some functions and classes that support this.

Here I want to present a small code snippet that terminates the currently running workflow and starts a new instance of the workflow on the same list item.
SPListItem listItem = workflowProperties.Item;
SPList spList = workflowProperties.List;
string initData = workflowProperties.InitiationData;

// The GUID of the workflow template to restart.
const string WF_GUID = "The GUID of workflow template";

// Terminate the instance currently running on the item.
workflowProperties.Site.WorkflowManager.RemoveWorkflowFromListItem(workflowProperties.Workflow);
workflowProperties.Site.WorkflowManager.Dispose();

// Find the workflow association on the list and start a fresh instance.
SPWorkflowAssociation associationTemplate = spList.WorkflowAssociations.GetAssociationByBaseID(new Guid(WF_GUID));

workflowProperties.Site.WorkflowManager.StartWorkflow(listItem, associationTemplate, initData);

Hope this gives you a good start on starting and terminating SharePoint workflows through code. Let me know what you think.


How to display site pages and subsites in navigation using the TreeView control in a SharePoint application.

For very big projects, in SharePoint or on any other platform, it's very difficult to organize the navigation so everything fits on the page. For example, if you have around 15 to 20 subsites and 50 pages, it's impossible to display all the links in one place. And it's a big problem: a site may hold plenty of data, but if a user can't find what he wants, it's all wasted. After thinking about this for a long time, I came up with a good solution, and here I present it with code.

What is my goal?
I want to show all the sites and pages of a SharePoint site in the navigation area [left navigation] using the ASP.NET TreeView control.

Below is the complete code for the TreeView functionality.
In my case we have 3 levels of data, so the tree view has 3 levels as well.

ASPX Code:
TreeView control declaration, with tree node styles for the 3 levels. We write some server-side code to shape and render the data using different conditions; for that purpose I use the PreRender event of the TreeView control.

<asp:TreeView ID="treeviewLeftNav" runat="server"
    Width="191px" HoverNodeStyle-CssClass="leftNavHover"
    DataSourceID="SiteMapDS" SelectedNodeStyle-CssClass="leftNavSelected"
    OnPreRender="ControlTreeView_OnPreRender">
    <LevelStyles>
        <asp:TreeNodeStyle CssClass="leftNav1" />
        <asp:TreeNodeStyle CssClass="leftNav2" />
        <asp:TreeNodeStyle CssClass="leftNav3" />
    </LevelStyles>
</asp:TreeView>

We are using a publishing site with all features enabled, so we use the PublishingNavigation control with a PortalSiteMapDataSource to pull all pages and sites from the SharePoint site.

<PublishingNavigation:PortalSiteMapDataSource ID="SiteMapDS" Runat="server"
SiteMapProvider="CurrentNavSiteMapProvider" EnableViewState="true"
StartingNodeOffset="0" ShowStartingNode="False"
TrimNonCurrentTypes="Heading"/>

In C#, I wrote the code below for the exact expand/collapse functionality of the TreeView.
C# code:
void ControlTreeView_OnPreRender(object sender, EventArgs e)
{
    foreach (TreeNode n in treeviewLeftNav.Nodes)
    {
        // Expand the node that corresponds to the current page.
        if (n.NavigateUrl == Request.Url.AbsolutePath)
            n.Expand();
        else
        {
            if (treeviewLeftNav.SelectedNode != null)
            {
                // Collapse every branch except the selected node's parent.
                if (n != treeviewLeftNav.SelectedNode.Parent)
                {
                    if (n.ChildNodes.Count > 0)
                        n.Collapse();
                }
            }
            else
            {
                treeviewLeftNav.CollapseAll();
                break;
            }
        }
        RenderTreeNodes(n);
    }

    if (treeviewLeftNav.SelectedNode != null)
        treeviewLeftNav.SelectedNode.Expand();
}

void RenderTreeNodes(TreeNode node)
{
    if (node != null)
    {
        if (node.ChildNodes.Count > 0)
        {
            foreach (TreeNode n in node.ChildNodes)
            {
                if (n.NavigateUrl == Request.Url.AbsolutePath)
                {
                    n.Expand();
                }
                // Guard against a null SelectedNode before touching its Parent.
                else if (treeviewLeftNav.SelectedNode != null)
                {
                    if (n != treeviewLeftNav.SelectedNode.Parent)
                        n.Collapse();
                    else
                        n.Parent.Expand();
                }
            }
        }
    }
}
Note: please change the web.config of the site to allow server-side code on the pages where you write it.

Ajax and master page problems in SharePoint 2007 [MOSS]

Problem:
Ajax is not working as expected in SharePoint 2007.

Solution:
1. First, install the ASP.NET AJAX extensions on the SharePoint server.
2. After installing Ajax, add the corresponding entries to the web.config file so the web application knows about the ScriptManager, Ajax requests, etc. For this we need to change the configuration.
3. Copy all the Ajax DLLs to the GAC, and then follow the instructions below.

There were plenty of problems integrating Ajax into SharePoint back when I was very new to it. I solved them after long research into the master page and JavaScript.

Here are the steps I followed.
Before changing any of the files mentioned below, please back up each file and keep the copies in a safe location.

I implemented this when I was new to SharePoint. I am not completely sure whether we need to change the authorizedTypes tag in the site's web.config or not, but it works.
The changed section looks like this:

Web.Config changes

<authorizedTypes>
<authorizedType Assembly="System.Workflow.Activities, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" Namespace="System.Workflow.*" TypeName="*" Authorized="True" />
<authorizedType Assembly="System.Workflow.ComponentModel, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" Namespace="System.Workflow.*" TypeName="*" Authorized="True" />
<authorizedType Assembly="System.Workflow.Runtime, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" Namespace="System.Workflow.*" TypeName="*" Authorized="True" />
<authorizedType Assembly="System.Transactions, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" Namespace="System*" TypeName="*" Authorized="True" />
<authorizedType Assembly="System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" Namespace="System*" TypeName="*" Authorized="True" />
<authorizedType Assembly="Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" Namespace="Microsoft.SharePoint.Workflow" TypeName="SPWorkflowActivationProperties" Authorized="True" />
<authorizedType Assembly="Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" Namespace="Microsoft.SharePoint.Workflow" TypeName="SPWorkflowTaskProperties" Authorized="True" />
<authorizedType Assembly="Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" Namespace="Microsoft.SharePoint.Workflow" TypeName="SPWorkflowHistoryEventType" Authorized="True" />
<authorizedType Assembly="Microsoft.SharePoint.WorkflowActions, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" Namespace="Microsoft.SharePoint.WorkflowActions" TypeName="*" Authorized="True" />
<authorizedType Assembly="System.Workflow.Activities, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" Namespace="System.Workflow.*" TypeName="*" Authorized="True" />
<authorizedType Assembly="System.Workflow.ComponentModel, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" Namespace="System.Workflow.*" TypeName="*" Authorized="True" />
<authorizedType Assembly="Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" Namespace="Microsoft.SharePoint.Workflow" TypeName="SPWorkflowActivationProperties" Authorized="True" />
<authorizedType Assembly="Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" Namespace="Microsoft.SharePoint.Workflow" TypeName="SPWorkflowTaskProperties" Authorized="True" />
<authorizedType Assembly="Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" Namespace="Microsoft.SharePoint.Workflow" TypeName="SPWorkflowHistoryEventType" Authorized="True" />
<authorizedType Assembly="Microsoft.SharePoint.WorkflowActions, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" Namespace="Microsoft.SharePoint.WorkflowActions" TypeName="*" Authorized="True" />
<authorizedType Assembly="mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" Namespace="System" TypeName="Guid" Authorized="True" />
<authorizedType Assembly="mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" Namespace="System" TypeName="DateTime" Authorized="True" />
<authorizedType Assembly="mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" Namespace="System" TypeName="Boolean" Authorized="True" />
<authorizedType Assembly="mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" Namespace="System" TypeName="Double" Authorized="True" />
<authorizedType Assembly="mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" Namespace="System" TypeName="String" Authorized="True" />
<authorizedType Assembly="mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" Namespace="System.Collections" TypeName="Hashtable" Authorized="True" />
<authorizedType Assembly="mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" Namespace="System.Collections" TypeName="ArrayList" Authorized="True" />
<authorizedType Assembly="mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" Namespace="System.Diagnostics" TypeName="DebuggableAttribute" Authorized="True" />
<authorizedType Assembly="mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" Namespace="System.Runtime.CompilerServices" TypeName="CompilationRelaxationsAttribute" Authorized="True" />
<authorizedType Assembly="mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" Namespace="System.Runtime.CompilerServices" TypeName="RuntimeCompatibilityAttribute" Authorized="True" />
<authorizedType Assembly="mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" Namespace="System" TypeName="Int32" Authorized="True" />
<authorizedType Assembly="mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" Namespace="System" TypeName="TimeSpan" Authorized="True" />
<authorizedType Assembly="mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" Namespace="System.Collections.ObjectModel" TypeName="Collection`1" Authorized="True" />
</authorizedTypes>

Master page changes:
Second, I removed the onload function from the BODY tag. It was causing a problem. I am not sure whether removing it is strictly required, but with it removed everything works as expected.

I removed the following attribute from the BODY tag of the master page.
onload="javascript:if (typeof(_spBodyOnLoadWrapper) != 'undefined') _spBodyOnLoadWrapper();"

JavaScript changes:
Third, go to the init.js file in the 12\TEMPLATE\LAYOUTS\1033 folder.
Find the method spFormOnSubmitWrapper().
In this function, remove the line "return false;" inside the "if (_spFormOnSubmitCalled)" statement.
SharePoint uses this check so that a page that has already posted back won't post back again.
But for AJAX to work we need to disable it: Ajax sends XmlHttpRequests many times, depending on user actions, and SharePoint ignores every call except the very first. This is why, the first time you integrate Ajax into SharePoint, the first call always succeeds and subsequent calls get no response.
Hope it helps.

MOSS My Calendar web part login problem for different accounts.

Introduction
When I worked with the My Calendar web part in SharePoint, I ran into a lot of questions and odd behavior. I had configured everything correctly in the My Calendar web part, i.e. the mail server address and the mailbox.
But when I open the page that contains the calendar web part, it prompts for the user name and password of the SharePoint site, and then again for the calendar login, i.e. the login page of the OWA application. If I provide the credentials, it works fine.

Now the problem starts. If other users try to access the same application, they always fail to log in to the My Calendar web part. This is because I entered my own mail account in the Mailbox setting; when others try to use the web part, it won't work for them at all. What they would need to do is edit the web part and change the Mailbox entry to their own email address, but that isn't possible for everyone, because not all users have permission to edit the page or web part.

The My Calendar web part does not work for multiple users in SharePoint 2007: whichever mail account you enter in the Mailbox setting is the only account that can log in. The reason is that the My Calendar web part was developed to show the current user's calendar events on his My Site page. A user normally has edit access on his own My Site, so he sees content that belongs to him. But per my requirements, I wanted to use the calendar on a site other than the My Site pages.

Solution
To solve the problem described above, I created a custom web part that works for any logged-in user. You can take the code below, build it, add the appSettings variables to the web.config, and deploy it to your site. You can find web part deployment instructions on MSDN.

Web.Config changes:
We need to provide 2 inputs to the web part.
1. Site Url.
2. Exchange Server Url.
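For example, the appSettings entries that the code below reads might look like this (the URLs are placeholders):

<appSettings>
  <!-- Keys match ConfigurationManager.AppSettings lookups in the web part. -->
  <add key="SiteUrl" value="http://portal.yourcompany.com" />
  <add key="ExchangeServerUrl" value="http://exchange.yourcompany.com/exchange" />
</appSettings>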

Requirements:
A profile import job must be set up in your SharePoint server's Shared Services Provider [SSP] (usually configured while installing SharePoint and setting up the SSP) to pull user accounts from Active Directory into the SharePoint profile database. The code below automatically reads the email ID of the logged-in user from the profile database and places it in the Mailbox entry of the My Calendar web part.

Usually, when we configure the profile import, we give the source and configure the schedule, e.g. every night or every week, and it pulls data from the given source [e.g. Active Directory] into the profile database.

using System;
using System.Collections.Generic;
using System.Text;
using Microsoft.SharePoint.Portal.WebControls;
using Microsoft.SharePoint.Portal.Topology;
using Microsoft.SharePoint.Portal;
using Microsoft.SharePoint.Portal.UserProfiles;
using System.Runtime.InteropServices;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.HtmlControls;
using System.Configuration;

namespace MyCalendarWebPart
{
    [Guid("15241046-2128-4a1d-b6d5-24c8a67c4d28")]
    public class MyCalendarWebPart : Microsoft.SharePoint.WebPartPages.WebPart
    {
        private OWACalendarPart wpCalendar;
        HtmlGenericControl litMsg = null;
        HtmlGenericControl roundedCorner;

        protected override void CreateChildControls()
        {
            wpCalendar = new OWACalendarPart();
            litMsg = new HtmlGenericControl();
            roundedCorner = new HtmlGenericControl();
            roundedCorner.InnerHtml = "";

            Controls.Add(configureCalendar());
            Controls.Add(litMsg);
            Controls.Add(roundedCorner);

            base.CreateChildControls();
        }

        private OWACalendarPart configureCalendar()
        {
            try
            {
                // Connect to the portal and get the portal context.
                TopologyManager topology = new TopologyManager();
                PortalSite portal = topology.PortalSites[new Uri(ConfigurationManager.AppSettings["SiteUrl"])];
                PortalContext context = PortalApplication.GetContext(portal);

                // Initialize the user profile manager object.
                UserProfileManager profileManager = new UserProfileManager(context);
                UserProfile profile = profileManager.GetUserProfile(true);

                wpCalendar.Title = "My Calendar";
                wpCalendar.ViewName = "Weekly";
                wpCalendar.CssClass = "";

                // Use the profile object to retrieve the properties your company
                // needs to resolve the mailbox name.
                string workmail = profile[PropertyConstants.WorkEmail].ToString();

                wpCalendar.MailboxName = workmail;
                wpCalendar.OWAServerAddressRoot = ConfigurationManager.AppSettings["ExchangeServerUrl"];

                wpCalendar.Height = "655";
                wpCalendar.Width = "600";
                wpCalendar.ImportErrorMessage = "No EmailID found for your account.";
            }
            catch
            {
                litMsg.InnerHtml = "No EmailID found for your account.";
            }

            return wpCalendar;
        }

        protected override void RenderWebPart(HtmlTextWriter output)
        {
            try
            {
                wpCalendar.RenderControl(output);
                litMsg.RenderControl(output);
                roundedCorner.RenderControl(output);
            }
            catch (Exception ex)
            {
                output.Write(ex.ToString());
            }
        }

        public override void RenderControl(HtmlTextWriter writer)
        {
            base.RenderControl(writer);
        }
    }
}

Note: in the above code, I pull the site URL and the Exchange server URL from the appSettings of the site's web.config.

Magic of Page Load and Page Init execution.

Problem:

Not completely understanding the page life cycle in ASP.NET.

This matters in big projects: if we initialize an object in the master page's Page_Load and try to access it in the ASPX page's Page_Load, we get an error, because the ASPX page's Page_Load executes first. If you run into this type of problem, move that code into the Init events of the master and ASPX pages.
Scenario I:
We have a page called "Default" and a master page called "DefaultMaster".
Both contain Page_Load events.
When a request comes in for the Default page, which Page_Load event is called first?
Answer: the Default.aspx Page_Load.
Scenario II:
The same pages as above, with Page_Init events also present.
Now which Page_Init will the .NET runtime call first?
Answer: the master page's Init event, not the Default.aspx page's Init event.
The good point to know here is that
all Init events are processed from the master page down to the ASPX page, and all Load events are processed from the ASPX page up to the master pages.

Page Init event:
Execution sequence starts from the Init event of the main master page --> sub master page --> ... --> requested ASPX page's Init event.

Page Load event:
Execution sequence starts from the Load event of the requested ASPX page --> sub master page --> ... --> main master page's Load event.


The whole page life cycle, containing both events:
Master page Init --> sub master page Init --> ... --> ASPX Init --> ASPX Load --> ... --> sub master page Load --> master page Load.
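A quick way to verify this ordering yourself is to trace it. A minimal sketch (handler bodies only; the page and master names are hypothetical):

// In the master page code-behind (DefaultMaster.master.cs)
protected void Page_Init(object sender, EventArgs e)
{
    System.Diagnostics.Debug.WriteLine("Master Init"); // fires first
}
protected void Page_Load(object sender, EventArgs e)
{
    System.Diagnostics.Debug.WriteLine("Master Load"); // fires last
}

// In the content page code-behind (Default.aspx.cs)
protected void Page_Init(object sender, EventArgs e)
{
    System.Diagnostics.Debug.WriteLine("Page Init"); // fires after Master Init
}
protected void Page_Load(object sender, EventArgs e)
{
    System.Diagnostics.Debug.WriteLine("Page Load"); // fires before Master Load
}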

Show Desktop icon in the Quick Launch area in Windows Server 2003.

I know most people look for this, and by default we don't see the Show Desktop icon in Windows Server 2003 or XP..... I too faced this problem when I was new to the IT field. It's simple and easy to fix.

Here are the steps to follow to add it to the Quick Launch area.
  • Open notepad.
  • Copy the below code.
[Shell]
Command=2
IconFile=explorer.exe,3
[Taskbar]
Command=ToggleDesktop

  • Save the file with the name ShowDesktop.scf somewhere on your system or on your desktop.
  • Drag and drop the file onto the Quick Launch area.
That's it!!! You are done. Now you have what you need. Enjoy these wonderful posts!!!
Is this helpful?

How to give a user specific permissions on a SharePoint list item through C#

SharePoint 2007 allows item-level permissions. This is a big change in security trimming from SharePoint 2003, and it's really useful in many scenarios: when an item belongs to someone else, or a task is assigned to another person, users can't see or edit each other's tasks.

Here is a small piece of code to set specific permissions on a list item for a user.
The example below is from my SharePoint custom workflow, customizing the task list.
// HybridDictionary mapping each user to the role they should receive on the task.
System.Collections.Specialized.HybridDictionary taskPermissions = new System.Collections.Specialized.HybridDictionary();
taskPermissions[workflowProperties.Originator] = SPRoleType.Administrator;
taskPermissions[taskApprover] = SPRoleType.Administrator;
taskitem.SpecialPermissions = taskPermissions;

Here, taskitem is the task's SPWorkflowTaskProperties object (SpecialPermissions is exposed on the workflow task properties, not on a plain SPListItem). Hope this helps.....

How to check whether SMTP is working or not.

To test SMTP server functionality, we follow a few steps to identify whether SMTP is configured correctly. Below are the steps to check the SMTP server; we use telnet from the command prompt to test it.

1. Open a command prompt and type:
telnet <servername> 25
Note: 25 is the port used by SMTP and <servername> is the SMTP server name.
After you hit Enter, you will get output like:
220 <servername> Microsoft ESMTP MAIL Service, Version: 6.0.3790.3959 ready at
Tue, 22 Jan 2008 09:10:27 -0600
This means you got a response from the SMTP server, and it's the clue that SMTP is set up on the server.

2. To test the connection, greet the server with helo.
Type:
helo <servername>
output:
250 <servername> Hello [IP Adress]

3. Now we need to enter the From address of the mail.
Type:
mail from: admin@domain.com
output:
250 2.1.0 admin@domain.com….Sender OK

4. It's time to enter the recipient email address.
Type: rcpt to: someID@domain.com
output:
250 2.1.5 someID@domain.com

5. Now we are left with the data of the email, i.e. the subject and body.
Type: data
output:
354 Start mail input; end with <CRLF>.<CRLF>

6. Type:
subject: this is a test mail
Hi
This is test mail body
I am testing SMTP server.

7. Hit Enter, then type a single period (.) and hit Enter again.
output:
250 2.6.0 <<servername>C8wSA00000006@<servername>> Queued mail for delivery

8. Type: quit
output:
221 2.0.0 <servername> Service closing transmission channel

If you did everything as explained, you will get the mail soon.

Importance of the !important property in CSS.

I think most people don't have a complete idea of the CSS keyword "!important".

It is useful in scenarios where we have plenty of style sheets and want to override all the styles applied to an element with some other style. In HTML, the styles closest to the element win: if we apply styles to a division in CSS and also write an inline style for the same element, the inline style always applies; it overrides all other styles for that element. Likewise, if our CSS applies different styles to an element in different places and we want one particular declaration to win, we add !important at the end of that style.

So when we need to override all other styles on a particular control, this property helps us.

Ex:

.topNav
{
    background-color: #990000;
}

.topNav { some other style... }

.topNav
{
    background-color: #818286 !important;
}

The last declaration overrides all the earlier .topNav rules and finally applies the background color #818286 to the top navigation.


How to find the number of days in a month in T-SQL

Sometimes we get a requirement to know the number of days in a given month, for example when calculating how many days remain in a month, or the last day of the month, or other scenarios.
Here is a small script that gives you the number of days for a given month. For example purposes, I derive the month from the current date.

declare @numberOfDaysInMonth int;

-- DATEADD(d, 1 - DAY(getdate()), getdate()) is the first day of the current month;
-- add one month, subtract one day, and DAY() of that last day is the month length.
set @numberOfDaysInMonth = DAY(DATEADD (m, 1, DATEADD (d, 1 - DAY(getdate()), getdate())) - 1);
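To sanity-check the formula, you can substitute a fixed date for getdate(); February 2008 was a leap month, so this variant returns 29. The date value below is just an example:

declare @d datetime;
set @d = '20080215';

-- First day of @d's month, plus one month, minus one day = last day of the month.
select DAY(DATEADD(m, 1, DATEADD(d, 1 - DAY(@d), @d)) - 1) as DaysInMonth; -- 29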


How to get the month name from a month index in T-SQL

When I work with reports, this is something I look for every time. A date comes in as input on the report, and I want to display the month name instead of the month index [1-12]. So, how do we do it?

Please use the script below to get the name of the month from a month index.
DateName(month, convert( datetime, '2007-' + cast(month(getdate()) as varchar(2)) + '-01', 120))
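Following the same pattern with a literal index (3 here, purely as an example; the year is arbitrary since only the month matters):

select DateName(month, convert(datetime, '2007-' + cast(3 as varchar(2)) + '-01', 120)) as MonthName;
-- returns 'March'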

Here, month(getdate()) returns the current month index from the getdate() function. Hope this helps...

Expand/collapse columns in a table in SSRS

Problem:

How to make columns collapsible and expandable in a table in SQL Server Reporting Services.

Solution:

I asked several people how to do this, and a lot of them proposed using a matrix. But my data is simple data with 14 columns: first name, last name, state, city, zip code, q1, q2, q3, h1, h2, h3, etc…

I need the columns q1, q2, q3 and h1, h2, h3 to be expandable and collapsible. By default, when the user views the report, I need to show only the first name, last name, state, city, zip code, q1, and h1 columns, with the q1 and h1 columns carrying the expand/collapse symbols.

When anyone clicks to expand, I need to show q2 and q3, and likewise h2 and h3.

Here are the steps I followed to make it work.

No matrix, nothing else; I did it with plain tables.
Just a small trick:

1. Select all the columns that you want to make expandable/collapsible.
2. Hit F4 [Properties window] –> select Visibility, and set these values: Hidden = True and ToggleItem = the ID of the textbox where you want to show the expand [+] / collapse [-] symbols; in other words, the parent textbox ID.
3. Done. Here we go.


Maddie-isms

Maddie: What are you doing?

Me: Feeding Bella some mama's milk.

Maddie: Can I have some milk?

Me: Errrrr.....NO.

Maddie: NOT THAT MILK!

********************************

Maddie: I really want those high heeled flip flops

Me: Your mama says you are too young for them. No.

Maddie: But I'll keep them at your house.

Me: It doesn't work that way, Missy. If Mama says no then you can't have them.

Maddie: But they are so cute!

Me: Yes, they are. Maybe when you are 10 you'll be old enough for high heeled flip flops.

Maddie: 10! I'm only seven.

Me: I know.

(insert about 15 more "buts" in here)

Maddie: BUT... Nevermind. "But" doesn't work with you.

Me: HA HA HA. You're right. "But" doesn't work with me.

Homeless

Well, we've been homeless now for almost 3 weeks. I'm lucky that my hubby actively scouts out the "perfect" homeless location for us under the next bridge, tree, or behind that tall rock. I'm so glad we have family and friends that love us enough to put us up for the night so I don't have to endure the "tinfoil" dinner that I'm sure John dreams about cooking for his homeless family.

TRAVEL LOG
That being said...we drove from Sierra Vista to Phoenix...spent Easter morning in Phoenix and then off to LA. In LA we saw Tracy and Dasia who so kindly kept our gigantic bags for us so we only had to pack what was needed for Portland. We spent about 10 days in Portland (four at the magical wonderful land of the beach)...and then flew back to LA...picked up our bags...and flew the next AM to DC.

We were a little stressed that our LAX to DC flight was going to leave without us, so as we were heading toward the security checkpoint we rounded the corner and saw the elevator doors closing. Like the skilled ninja mama that I am, I aimed and pushed the speedy stroller to catch the doors...let go, and off flew our Graco stroller right into the door of the elevator...jammed in between the door and the wall...the elevator doors opened back up...we had a good laugh when we caught up to it and saw a poor old man who had a heart attack when the stroller went flying into the elevator alone...whoops....:) Oh...I should probably mention that Bella was not IN the stroller. Sheesh...wouldn't that give me mom of the year award?

Bella was a rockstar on the airplane...singing like the lead singer of a scream rock band. Music for my ears and loud enough for the whole plane to enjoy. I loved watching our seat-in-front neighbors pull out the earplugs. Ha ha ha...it's a good thing she is so cute.

Now we're in DC where I have enjoyed our time with the day family and I got to see two great friends last night where sissy Jenna once again OUTDID herself in the BBQ foodfest department. I have consumed about 7 rice krispie treats in the past 2 days and gained 14 pounds. I love it. Bella and Savannah cooed and kicked at each other like they've known each other their whole lives.

Tomorrow we fly off into the wild blue yonder to Frankfurt. I'm sooooo glad the flight is overnight so baby girl will sleep (fingers crossed...)

Germany...here we come!

Framingham Heart Study has lessons for Twitter

In 1948, the National Heart Institute embarked on an ambitious study of the causes of heart disease. By studying 5,209 men and women between the ages of 30 and 62, who lived in the town of Framingham, Massachusetts, and who had not yet developed overt symptoms of heart disease, researchers were able to learn about the underlying causes and symptoms of one of the leading causes of death and serious illness.

The Framingham study was groundbreaking in its contribution of knowledge about heart disease, and has become the foundation of medical diagnosis, practice, and treatment in the area throughout North America and Europe. It has led to over 1200 articles in medical journals in the past half century.

However, in recent years there have been questions about its ability to correctly model the risk factors of certain patients, especially those of South Asian and African descent. The population of Framingham, Mass., when the study was conducted was predominantly Caucasian, and thus the guidelines it produced have come into question in today's multicultural world.

Quite simply, as populations change, rules and guidelines built before the change may need to be re-examined for their applicability after the change.

So, what does this have to do with Twitter? Twitter too, has guidelines, practices, and approaches that work. The vanguard of Twitter, folks like @guykawasaki, @chrisbrogan, @armano, and @conversationage have communicated these guidelines and practices, and in doing so have made Twitter into a very effective and very popular tool for communication.

But in becoming so effective and popular, the population on Twitter has fundamentally changed.


Now, Ashton Kutcher, Britney Spears, and Oprah are avid Twitterers; colleagues who had not heard of Twitter 6 months ago are joining to "see what it's about"; every politician is borrowing a page from Obama and joining the fray; and radio stations seem to mention their Twitter handle more frequently than their web address. @MackCollier has already written about the changing population.

Plenty of people are debating whether this is a good thing or a bad thing. I think that debate is moot. The population of Twitter has changed and is changing, and the change will continue to be interesting.

The real question is, much like the Framingham Heart Study, does a change in the population on Twitter mean a change in the guidelines?

It's difficult to believe that Ashton and Britney are following the "rules" that were tacitly understood prior to their arrival, and in fact I don't think it would make sense for them to. They bring a different persona to the population, and in doing so, change the landscape a small amount.

I don't profess to know how the guidelines of Twitter will evolve with a new, larger, more mainstream population in place, but I do have some questions:

- will it be accepted for celebrity Twitterers to be "outbound" only, rather than interactive?
- will we accept that many Twittering celebrities will be ghost written? Or will we insist on authenticity?
- will Twitter become the de facto standard for news distribution for outlets like radio stations and news publishers?
- how does our organization of Twitter communication evolve beyond Tweetdeck filters and the like when there are "updates" from celebrities or news outlets and "communication" from individuals to manage?
- what measures of influence will evolve to indicate the most influential writers?


What are your thoughts? How will the norms of communication on Twitter look 12 months from now, as the population continues to change?


National Instruments: Multiple Activities Leading to Multiple Responses

National Instruments has done a great job of creating an information-rich web presence that provides relevant and useful information to its audience of scientists and engineers. It implemented a very elegant, equitable exchange of information: asking for small amounts of information from its audience in exchange for access to the information resources, and nurturing prospects based on the information they requested.

The challenge that Helena Lewis and the team at National Instruments needed to tackle, though, was what happens when a prospective buyer is very active on their site and accesses multiple information resources in different areas. Here is a case study on how they tackled the challenge, from Digital Body Language:


National Instruments: Multiple Activities Leading to Multiple Responses


National Instruments leveraged the rich information on prospects’ interests that it gleaned from prospect digital body language on its Web site to deliver highly targeted and relevant communications. The success of these communications was evident in the very high open and clickthrough rates discussed earlier. To achieve this, however, National Instruments had to overcome an operational challenge.

The mapping of online activities to communications was straightforward, but also created a challenge. What should happen if a site visitor performs multiple actions that warrant a communication? For instance, downloading four whitepapers should not result in four communications.

To ensure that prospects are not inundated if they perform a number of triggering activities, National Instruments built a waiting period of 24 hours into its scoring. If multiple actions were seen in a 24-hour period, the actions were scored individually and the most relevant communication was selected. Similarly, if an action had been performed before (for example, downloading an automated test guide), the prospect was not sent communications that had this as a call to action.

This solved the challenge of too many communications, but National Instruments also realized that certain key actions should bypass this logic. For example, if a visitor abandons a shopping cart, or saves the configuration of a product, a communication would be immediately triggered. The 24-hour delay was reserved for communications that were deemed less critical.

Since National Instruments is a global organization, each time it learned a better way to interact with customers and built processes for doing so, it replicated that logic and structure and separated it from the content. In this manner, it only needed to translate content and messaging to roll out its program to any of 35 countries.

Why the Contact Washing Machine must be In-House

As B2B marketers, we all deal with the same reality: we receive a continuous stream of dirty data, yet we realize that success requires working with clean data. I wrote some time ago about the Contact Washing Machine concept, a data cleanliness program that standardizes and normalizes data within a B2B marketing platform.

Since that time, however, I've had a lot of conversations in which marketers have suggested that they don't need the discipline of the contact washing machine to keep their data clean, since they have a service (either a tool or an agency service) to which they send their marketing data once in a while to have it cleansed.

The challenge with this approach is that the data we are dealing with is continually being used, added to, and edited. Every source of data that flows into your marketing database is an opportunity to dirty the data within it. While you may be able to control some sources of data, many you cannot, and if a data source is contributing dirty data, your marketing database will very quickly hold a percentage of data that cannot be relied upon.

The sources of data to most of today's B2B marketing platforms are quite varied, and many of them either are not, or cannot be, rigorously controlled in terms of the data they pass into your platform.

CRM systems and data marts have their own data rules and standards, and may contribute data of varying quality. Event registrations may be provided by third-party vendors and thus collect data in non-standard ways. Web forms may allow free-form text fields for titles, industries, or address fields, and in doing so contribute dirty data. List uploads, whether purchased or from legacy data stores, are often of dubious quality, and any integration with your sales team's desktop email environment means they will contribute data however they choose to type it.

In short, the variety and breadth of data sources in most marketing environments means that we cannot control what data is coming in, and thus, our data quality begins to degrade as soon as we have finished a bulk data cleansing process.

The only viable solution, and the concept behind the Contact Washing Machine, is to bring the process in house. In order to have clean data in your marketing platform, your data must be continually cleansed at each touch point. Each time data is sourced or changed, it should be put through the cleansing process of the Contact Washing Machine in order to remain clean and standardized.
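
As a rough sketch, one pass through a Contact Washing Machine might look something like the following. The field names and normalization tables are hypothetical, and a real implementation would carry far more rules, maintained over time:

    import re

    # Hypothetical normalization tables -- illustrative entries only.
    TITLE_MAP = {"vp mktg": "VP Marketing", "v.p. marketing": "VP Marketing"}
    COUNTRY_MAP = {"usa": "United States", "u.s.": "United States",
                   "deutschland": "Germany"}

    EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

    def wash(contact):
        """Standardize and normalize one contact record at the point of entry.

        Called on every inbound record -- web form, list upload, CRM sync --
        so dirty data never settles into the marketing database.
        """
        clean = dict(contact)

        # Trim stray whitespace on every text field.
        for field, value in clean.items():
            if isinstance(value, str):
                clean[field] = value.strip()

        # Normalize job title and country against standard values.
        title = clean.get("title", "").lower()
        clean["title"] = TITLE_MAP.get(title, clean.get("title", ""))
        country = clean.get("country", "").lower()
        clean["country"] = COUNTRY_MAP.get(country, clean.get("country", ""))

        # Flag records with invalid emails rather than silently keeping them.
        clean["email_valid"] = bool(EMAIL_RE.match(clean.get("email", "")))
        return clean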

While bulk data cleansing can offer additional cleaning beyond the automated standardizing, cleansing, deduping, and normalizing that a Contact Washing Machine provides, both are necessary. If you think of clean data as the oil that keeps your marketing engine running smoothly, bulk data cleansing is like an oil change, and the Contact Washing Machine is like the oil filter within the engine. An oil change may do more to clean the oil, but you won't get far from the mechanic's garage if your engine does not have an oil filter to keep things continually clean.
Read More...

Strategy and Tactics in B2B Marketing with Social Media

The world of B2B marketing is changing. A lot.

Many of us have been in conversations where we discuss the tactics we are using and how they add up to an overall marketing strategy. I've been in a lot of these discussions too, and the question of what defines a marketing strategy makes for an interesting debate.

The question is whether a strategy needs a defined outcome (an exact destination) or a general outcome (a general directional goal) as its intended target.

In my view, restricting strategic thinking to a defined outcome, instead of just a directional one, leaves it almost useless for uncertain areas like social media. I would be rather suspicious of any marketer who felt comfortable setting out exactly what their 5-year (or even 2-year) goals were for their social media strategy, along with tactical milestones along the way. The honest truth is we don't know. None of us do.

However, that does not leave us strategically hamstrung. If you look at the strategy and tactics of a participant in the American westward migration of the mid-1800s, you see a similar pattern.

Strategy:
  • Go West, Young Man

Tactics:
  1. Get a covered wagon
  2. Drive to the western edge of town
  3. Keep going west
  4. Avoid any disasters that arise en route
  5. Repeat steps 3 and 4
  6. You'll know when you end up where you want to be
The reality is you are getting into unknown territory, and as such, it is not possible to define a strategy down to an exact level of detail with a specific end goal in mind.

Today's B2B marketing world, especially as we engage with social media, has a similar challenge. If we attempt to define exactly what the outcomes are of any specific initiative, we end up unable to even start. Directionally, we do know where we want to go:
  • Better engagement with customers
  • Better understanding of their needs
  • Better awareness of our message in the market
However, it is not possible to always accurately map our tactics to these directional goals. Sometimes we just have to set a directional strategy and execute against it as well as we can.

Is this how you're thinking of your B2B marketing strategy in social media? Here's what we're doing at Eloqua, which is working well in terms of engagement, understanding, and awareness, but as a general direction, not an exact destination. Have you been able to get your executive team comfortable with a directional strategy? Read More...

Synopsys: Centralized Marketing Communications

Large organizations whose marketing groups have grown organically over time often have fairly disparate functions, content, and databases. Bringing those distributed marketing efforts into a central marketing database can be an interesting challenge. Mital Poddar at Synopsys walked me through how they tackled that centralization effort when I chatted with her during the writing of Digital Body Language.

Using a combination of incentives, including tracking of digital body language, high quality branded templates, and management of opt-out requests, Synopsys was able to win over the distributed marketing teams and centralize over 100 independent marketing databases.


Synopsys: Centralized Marketing Communications

Synopsys is a world leader in software and IP for semiconductor design and manufacturing. As such, the sales process is very knowledge- and education-oriented. Sophisticated buyers and sophisticated sellers exchange lots of information throughout a lengthy process. Because of this, Synopsys discovered that each of its product marketing managers was sending micro-campaigns to small lists of prospects at various stages of the education and sales process.

This had been a somewhat functional process, but it did not allow Synopsys easy control of its brand and messaging, and the risk of over-communicating to customers due to a lack of centralized control had become significant. It also did not give Synopsys any insight into the digital body language of those prospects, as each individual salesperson would send their small-scale campaigns in their own way – often from their desktop – in a manner that did not allow centralized tracking.

To operationalize these communications in a way that still allowed the knowledge-intensive sales process to progress, but gave Synopsys better control over its brand and better insight into prospects' digital body language, the company decided to centralize the process. Each salesperson could send any communication they desired to any list of their prospects, but it would be executed centrally by a marketing service bureau (of one individual). This enabled consistency in brand messaging and started the process of keeping historical campaign data in one location.

As with any organization, there were pockets of resistance to either the creative standardization (all communications would now share a common look and feel) or to giving up control of a list of contacts. The centralization, however, offered three benefits that outweighed these hesitations. A common theme was more aesthetically pleasing than most of the individual efforts, winning over many. The reduction in effort was a second significant selling point. The ability to instantly see the results of each campaign, and the individuals who had clicked through and sought further information, was the final advantage that won over skeptics.

Over two months, the transition was made to this new operational model. The field team was able to quickly craft the message and the target audience they had in mind, which was then passed to the central marketing service organization. By centralizing management of the final creative touches and the distribution of the messages, the marketing organization was able to maintain control over the branding and look & feel. The team was also able to ensure the proper tracking was in place to allow insight into the prospects' digital body language.

Through centralizing these communications, the Synopsys team was able to gain control over their brand and the frequency with which they communicate with prospects, while at the same time building rapport with their sales organization. By adding in the ability to observe the customers’ digital body language, they also began to build a foundation for deeper insights into their audience, and for an internal culture of analytics.

Read More...

Our B2B Facebook Marketing Strategy

It's a work-hard, play-hard world out there, and there's nothing better to end a day of deep discussions on marketing than sharing a few drinks or grabbing a bite to eat.

At Eloqua, we are lucky to have some of the most interesting, fun, and social customers around (they are marketers, after all), so when we started talking about what one might do with Facebook from a B2B marketing perspective, an obvious answer came up: keep it friendly.

I suspect we're not the only marketing team that was on one hand intrigued by the reach and buzz of Facebook, but on the other hand not quite sure what would work from a B2B marketing perspective. The idea of discussing the latest whitepaper or marketing best practices did not seem to fit the social vibe of Facebook. When the discussion turned to our customers, though, we quickly realized that the social interactions we had with them were as interesting and valuable as the marketing discussions.

Another thing we discussed was that many of our team work with clients remotely before ever meeting them in person. Friendships develop over email and phone conversations long before any face-to-face interaction.

So, our Facebook plan developed. We wanted to explore and expose the social side of Eloqua. As we go to events, share a few drinks with clients, or have dinner with marketing groups, we'll post the pictures on Facebook so the clients we are already friends with can get to know us better as people, and the people we have met at user groups around the country can reconnect online.

What is our business rationale for this? That's a great question. The Gallup organization did a workplace poll a few years back that observed that having good friends at work was a significant factor in employee retention (http://gmj.gallup.com/content/511/Item-10-Best-Friend-Work.aspx). Could the same idea hold for client retention? We're not sure, but it's fun to try, and a great reason to get more social with our clients.

Our commitment is that we won't "talk shop". Every marketer we have interacted with who experiences an Eloqua event and feels the vibe and energy of the community is excited to join it, and that is motivation enough for us. Will this strategy work? We're not sure, but as David Armano said the other day, we're all learning by doing - http://darmano.typepad.com/logic_emotion/2009/03/brands-will-learn-by-doing-get-used-to-it-.html

I'm interested in your comments, whether you're an Eloqua customer or not. What have you seen with this type of B2B Facebook strategy? What has worked, and what has not? What are your first reactions? If you are a customer, what are your thoughts on our Facebook presence? What would you like to see more of, or done differently, about the social side of Eloqua? Read More...

VFA: Nurturing to Re-Engage Dead Leads

Almost all of us in B2B marketing wrestle with the challenges of the leaky funnel. As we pass leads to sales that are not ready, perhaps because they are a "future likely" rather than a current opportunity, we end up with a dead lead pile.

Re-engaging with this dead lead pile can be one of your easiest ways to generate more active opportunities without significant additional expenditure. Amy Marks from VFA took this approach, and I had the pleasure of chatting with her about it when I was writing Digital Body Language:

VFA: Nurturing to Re-Engage Dead Leads

VFA, a leading provider of end-to-end solutions for facilities capital planning and asset management, had many dormant leads in its marketing database. Amassed over a period of years, through tradeshows, lists and sales activities, these “dead” leads were stored in VFA’s CRM system, had never been converted to opportunities, and were no longer receiving communications from VFA.

To engage these leads, VFA implemented a 5-part nurturing program that provided unique content to each of the 6 verticals targeted by VFA. The initial communications were case study focused, and progressed to white paper and webinar downloads, then offers to request a demo or engage directly with sales.

At each step of the process VFA’s marketing team enabled the prospect to engage in a way that was governed by the prospect’s own buying process. Each email offer connected to a landing page that described how similar organizations were able to meet their business challenges.

Additional resources—articles, case studies, white papers—were offered, allowing prospects to select the information they needed depending on their stage in the buying process. At each step, the prospect had the ability to "short cut" the process and jump straight to a later stage by either requesting a demo or engaging with the VFA sales team.

The campaign succeeded in bringing back a tremendous number of leads from the “dead”. Over 120 highly qualified leads were passed to sales, and over $1.6M in pipeline was created. In a typical example, a lead may have been disqualified at a much earlier date, but given changes in the prospect’s organization, they were now ready to purchase and a sales opportunity was discovered. Only through nurturing and observing the buyers’ digital body language were these opportunities rediscovered.
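
A rough sketch of how the five-step, vertical-specific structure described above might be modelled follows; the step names, fields, and landing-page paths are invented for illustration and are not VFA's actual program:

    # Five nurture touches, from case study through to direct sales offers.
    STEPS = ["case_study", "white_paper", "webinar", "demo_offer", "sales_offer"]

    def next_touch(lead):
        """Return the next communication for a dormant lead, or None.

        lead is a dict such as {"vertical": "healthcare", "step": 2}.
        """
        # A lead that raises a hand short-cuts the sequence entirely.
        if lead.get("requested_demo") or lead.get("requested_contact"):
            return ("sales_handoff", None)

        step = lead.get("step", 0)
        if step >= len(STEPS):
            return None  # sequence complete; lead stays in the database

        # Each email links to a landing page with vertical-specific content.
        vertical = lead.get("vertical", "general")
        return (STEPS[step], "landing/%s/%s.html" % (vertical, STEPS[step]))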

Read More...

What the Prius can teach about B2B Marketing Analytics

Anyone who has driven a Prius knows that it is an interesting experience, different from that in most other cars. However, can it really teach us about B2B Marketing Analytics?

I think the answer is yes.

If you look at the dashboard of the Prius, it has some interesting data. Mainly, it shows your energy consumption and your gas mileage in miles per gallon. The data is right in front of you, and continuously updated.

It is interesting in that there are no compensation plans, MBOs, or commission structures associated with your gas mileage, yet every Prius driver will report the same effect: you change your driving habits. Just by having the metrics in front of you, you feel compelled to adopt the behaviours that drive the number in the right direction: accelerating gently, not driving too fast, braking slowly.

This works in B2B marketing too. Socializing metrics can drive behaviour, regardless of the compensation plans or MBOs associated with those metrics. If your marketing organization does not share, socialize, and discuss metrics on campaign response, you may be surprised at what happens when you do.

We all react naturally to seeing metrics on our own behaviour, and much as in driving a Prius, if we see that our marketing campaigns are not influencing buying behaviour, not driving inquiries, or not guiding leads to become marketing qualified leads, we will likely stop those activities. Read More...

5 steps to a more honest view of buyer interests

I was talking with Stefan Tornquist the other day while we got ready for a video webcast with On24 (see the webcast here). We spent some time discussing one of the results from an interesting survey that MarketingSherpa did on buyer transparency.

Sherpa had asked the question "How often do you provide accurate information during registration?" of 2,700 technology buyers, in the context of events such as webinars and virtual events. The results were very interesting.

Data such as name and email were generally provided accurately, with around 70% of respondents always providing accurate information and another 20% sometimes providing it. Beyond that, however, prospects' likelihood of submitting accurate information plummeted. For Job Title, only 53% said they always submitted it accurately, and for Company Size it dropped further, to only 40%. Although information such as readiness to buy, or main area of interest, was not studied in this survey, most marketers would intuitively suspect that it would be significantly less accurate than even Job Title and Company Size.

This challenge underscores the importance of observing what prospective buyers do, rather than just what they say, in understanding them as buyers. Here are five steps toward that more honest view (a rough sketch of the mapping follows the list):
  1. Map your buyer's buying process and understand how each buyer progresses from education through to vendor discovery, validation, and purchase.
  2. Understand your marketing assets, and map each of your marketing assets into the "buyer's toolkit" so you can understand where each is applicable in the buying process.
  3. Define areas of your website that also map into this buyer's toolkit, allowing you to understand how web activity best maps to buying stage and area of interest.
  4. Add in search activity to give you an even more refined view of buyer interest and intent. Understanding what questions each prospective buyer is asking gives you a much more accurate view of their stage in the buying process.
  5. Present the information on each prospect's true area of interest, or stage in the buying process, to your Sales team in the environment that they are most comfortable in - their CRM system.
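
For instance, the mappings of steps 2 through 4 might be sketched as follows; the stage names, assets, and search-term hints are hypothetical examples rather than a prescribed taxonomy:

    # Hypothetical mapping of marketing assets / site areas to buying stages
    # (steps 2 and 3), plus search-term hints (step 4).
    ASSET_STAGE = {
        "intro_whitepaper": "education",
        "comparison_guide": "vendor_discovery",
        "roi_calculator": "validation",
        "pricing_page": "purchase",
    }
    SEARCH_HINTS = {"what is": "education", "vs": "vendor_discovery",
                    "pricing": "purchase"}
    ORDER = ["education", "vendor_discovery", "validation", "purchase"]

    def infer_stage(activities, search_terms):
        """Infer the furthest buying stage a prospect's behaviour supports."""
        observed = {ASSET_STAGE[a] for a in activities if a in ASSET_STAGE}
        for term in search_terms:
            for hint, stage in SEARCH_HINTS.items():
                if hint in term.lower():
                    observed.add(stage)
        # Step 5 would surface this result to sales inside the CRM system.
        return max(observed, key=ORDER.index) if observed else None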

There is no way to guarantee that your insights into buyers' roles, interests, and industries are accurate. However, if you look at their digital body language to see what they do, and what they show interest in, and use that information to augment what they fill out on web forms, you will have a clearer picture of their interests than through web forms alone. Read More...
