Adding a list item to Document library through c# in SharePoint 2007

As we all know, the required field for a document library is not Title: a document library item requires an uploaded file, and that is the only required field by default. You can add columns and make them required as well. So, when we want to add a list item to a document library through code, we need to collect both the uploaded file and the metadata, i.e. the column values. On the custom form for the document library we therefore add ASP.NET controls for the file upload and for all the other columns. Here is the C# code I use to create a list item in a SharePoint document library. I think this is the most efficient way of adding a list item to a document library; if not, please post your ideas.

string fileName = fileUpload.PostedFile.FileName;
using (SPSite site = new SPSite("http://sharepointserver"))
{
    using (SPWeb web = site.OpenWeb("/"))
    {
        try
        {
            web.AllowUnsafeUpdates = true;
            using (FileStream fs = File.Open(fileName, FileMode.Open))
            {
                SPList list = web.Lists["Documents"];

                // Column label/value pairs collected from the custom form
                // (see the note below about the keys and values collections).
                Hashtable metaData = new Hashtable();
                for (int i = 0; i < keys.Count; i++)
                {
                    metaData.Add(keys[i], values[i]);
                }

                // Add the file (by name only) to the library root folder together with
                // its metadata, overwriting any existing file with the same name.
                SPFile destfile = list.RootFolder.Files.Add(
                    fileName.Substring(fileName.LastIndexOf("\\") + 1),
                    fs, metaData, true);

                if (destfile == null)
                    lit.Text = "Error in adding file"; // lit is a Literal control on the form
            }
        }
        catch
        {
            // Log or surface the error as appropriate for your form.
        }
        finally
        {
            web.AllowUnsafeUpdates = false;
        }
    }
}

NOTE: fileUpload is the file upload control I am using in my code to get the file from the custom form, and the keys and values collections hold the column labels and their values respectively. If you know the list column names, you can remove the for loop and hard-code the label/value pairs while adding them to the metadata, as in the code below.

metaData.Add("Title", "DemoDocument"); metaData.Add("Version", "1.0"); metaData.Add("Author", "Praveen"); metaData.Add("ContentType", "Document");

NOTE: There is a problem adding the list item through code if your list columns are not based on the string, int or datetime data types; you will get an exception. Follow this post to fix it.

Please post your ideas on it.


Show data from list by using web services through SharePoint designer

When working on big SharePoint projects we get really interesting requirements, and most of them are challenging. In some cases we need to show data on one site by pulling it from a list on another site. For example, on the “Site A” home page we want to show data from another site or sub site. To do this we can call the SharePoint web services to query the list and get the items, and that is what this blog post is about.

How is it useful?

It is useful because, as far as I know, we have two ways of doing this without coding.

1. By using IFrames on the web pages. Add a Content Editor web part to the page where you want to show the list content, add an IFrame to it, and point it to the page in the other web site where the list data lives. This does nothing clever; it just loads that page as-is, and it is not the right way to implement the requirement. If authentication differs you have to log in twice, once for the main site and once for the IFrame, and IFrames are often ruled out on web pages for security and performance reasons. So it is not a good approach. Is there any other way?

2. Yes, there is a better way of implementing it: connect to the web services of the site we want to get the data from, pass credentials, and select the list or library you want. Then add a web part to the page and bind it to the list data pulled from the web service. That's it; it is the easy and efficient way. I explain it in a little more detail below.

  • Create a SharePoint web part page for showing content by using web services.
  • Open the page in SharePoint designer and detach it from page layout.
  • Now we need to add a web part for showing the data. SharePoint has a very flexible and widely used web part for exactly this: the Data View web part. Add it to the page.
  • The web part is ready and now needs a data source, so we have to point it at the list we want to pull the data from. The list can be in the current site, a sub site or some other site. [Figure: SharePoint XML web services data sources]
  • Go to the Task Panes menu in SharePoint Designer and select Data Source Library. Here you can see all the data sources, but only those that exist in the current site. If you want to pull data from a list in a subsite or another site, you won't find its entry in the Lists and Libraries section; this is where we need XML web services. See the figure above for more details.
  • Now we need to add a new XML web service data source to our site. How difficult is it? Very simple; just follow the steps below.
  • Expand the XML Web Services node and you will find a link called "Connect to a web service".
  • A window opens where you can set the properties of the web service data source. [Figure: XML web service data source properties] On the General tab, fill in the name, description and keywords.
  • Move to the Source tab and fill in the service description location. You can browse to the site and pick the asmx file you want, or enter the URL manually if you know it. Usually the location looks like this: http://sharepointserver/_vti_bin/lists.asmx?WSDL
  • Remember, I have given the lists.asmx path only as an example; you can use any of the available web services.
  • Before clicking the Connect Now button, check the credentials. You can enter specific credentials, use the default/Windows credentials, or choose not to be prompted for credentials. Set the credentials type and then click Connect Now.

[Figure: Data source login options]

  • Now choose the command to configure. In our case we want to show data, so it should be the Select operation.
  • Select the port: ListsSoap [the default].
  • Select the operation: GetListItems.
  • Now it is time to pass parameters to the data source. First set the listName parameter to the list you need to get the data from. You can also pass parameters to the web service to filter the results; for example, if your list has a Boolean column called IsUpdated and you only want to show the items where IsUpdated=false, you can set that here (a sample CAML query for this filter appears after these steps). [Figure: Data source parameters] With that, we are done configuring and setting the data source properties.
  • The data source is ready; now we need to link it to the Data View web part.
  • If everything is set correctly, you will see the new entry under the XML Web Services section.
  • Click the data source and choose Show Data from the menu.
  • This opens the Data Source Details task pane, which pulls all the column information from the list you connected to, as shown below.

[Figure: Data Source Details] Select the list of columns that you want to show on the page. [You can select multiple columns by holding the Ctrl key.]

  • Now click the "Insert Selected Fields as" option and choose Multiple Item View. This adds all the selected columns to the Data View web part.
  • Save the page and view it in browser.
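As a side note, the query parameter that the GetListItems operation exposes in the data source properties takes a CAML query. The snippet below is only a sketch of what the IsUpdated filter mentioned above might look like; IsUpdated is the example column from this post, not a built-in field:

    <Query>
      <Where>
        <Eq>
          <FieldRef Name="IsUpdated" />
          <Value Type="Boolean">0</Value>
        </Eq>
      </Where>
    </Query>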

How nice it is, and how simple it is?

Really, thanks to SharePoint Designer and the Data View web part for making our work easier.

Please post comments if you have any questions.


Deleting all the list items at a time from a list using a batch command in SharePoint 2007

When we work with SharePoint, especially when programming, we can discover plenty of things; SharePoint is a very big system and there is always more for developers and administrators to explore. It's a lovely product from Microsoft and I really love it, and from my experience I thought of putting all my SharePoint material on this blog. So now we are going to discuss performance and the operations we perform in code when dealing with SharePoint objects.
When we write SharePoint code we need to think for a minute about its efficiency and performance. For example, a simple person object with 10-15 properties won't cause serious problems if we forget to dispose of it. But SharePoint objects like SPSite and SPWeb are very heavy and occupy far more memory than the objects we use every day in C#, so don't forget to wrap them in a using statement so that they are disposed and removed from memory automatically when their scope ends.
Another performance topic is performing a common operation on all items in a SharePoint list. Usually we write a foreach statement to loop through all the list items and perform the operation, i.e. add, edit or delete. When the data varies per item we have to call the update inside the foreach loop, but when we know we are applying the same change to every item we should not update item by item, because it is costly, time consuming and uses a lot of resources repeatedly. For this the SharePoint team has already built in a nice feature: batch programming for operations on list items. It is the most efficient way of deleting or updating many items in a single list.
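For example (this is only a sketch, not code from this post), a batch update that writes the same value into one column of every item uses Cmd=Save, and the column is addressed through a SetVar whose name is the internal field name prefixed with the urn:schemas-microsoft-com:office:office# namespace. Status and Completed below are placeholder field and value names, and list and sb are the same objects used in the delete code that follows:

string updateCommand = "<Method><SetList Scope=\"Request\">" + list.ID + "</SetList>" +
    "<SetVar Name=\"ID\">{0}</SetVar>" +
    "<SetVar Name=\"Cmd\">Save</SetVar>" +
    "<SetVar Name=\"urn:schemas-microsoft-com:office:office#Status\">Completed</SetVar></Method>";
// Each formatted Method element is appended between the same <Batch>...</Batch>
// wrapper shown in the delete example below, then sent to ProcessBatchData.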
Below is the code I use to delete all the items in a SharePoint list.
private void DeleteAllItemsUsingBatch()
{
    using (SPSite site = new SPSite("http://mySharePointServer"))
    {
        using (SPWeb web = site.OpenWeb("/"))
        {
            SPList list = web.Lists["Links"];

            // Build the batch: one Method element per item, wrapped in a Batch element.
            StringBuilder sb = new StringBuilder();
            sb.Append("<?xml version=\"1.0\" encoding=\"UTF-8\"?><Batch>");
            string batchCommand = "<Method><SetList Scope=\"Request\">" + list.ID +
                "</SetList><SetVar Name=\"ID\">{0}</SetVar><SetVar Name=\"Cmd\">Delete</SetVar></Method>";
            foreach (SPListItem item in list.Items)
            {
                sb.Append(string.Format(batchCommand, item.ID.ToString()));
            }
            sb.Append("</Batch>");

            web.AllowUnsafeUpdates = true;
            site.RootWeb.ProcessBatchData(sb.ToString());
            web.AllowUnsafeUpdates = false;
        }
    }
}
Below is the code for deleting all the items in a document library. Remember, the only difference is that in a document library the file is the primary field, so we also have to pass the file reference along with the other parameters. The code is below.
private static void DeleteAllItemsUsingBatch()
{
    using (SPSite site = new SPSite("http://mysharepointserver"))
    {
        using (SPWeb web = site.OpenWeb("/"))
        {
            SPList list = web.Lists["Documents"];

            StringBuilder sb = new StringBuilder();
            sb.Append("<?xml version=\"1.0\" encoding=\"UTF-8\"?><Batch>");
            // For document libraries we also pass the server-relative file URL (owsfileref).
            string batchCommand = "<Method><SetList Scope=\"Request\">" + list.ID +
                "</SetList><SetVar Name=\"ID\">{0}</SetVar><SetVar Name=\"Cmd\">Delete</SetVar>" +
                "<SetVar Name=\"owsfileref\">{1}</SetVar></Method>";
            foreach (SPListItem item in list.Items)
            {
                sb.AppendFormat(batchCommand, item.ID.ToString(), item.File.ServerRelativeUrl);
            }
            sb.Append("</Batch>");

            web.AllowUnsafeUpdates = true;
            site.RootWeb.ProcessBatchData(sb.ToString());
            web.AllowUnsafeUpdates = false;
        }
    }
}
Explore more and know more. Happy Coding!!!

Sherlock Holmes' Insights on B2B Marketing Data


"Is there any point to which you would wish to draw my attention?"

"To the curious incident of the dog in the night-time."

"The dog did nothing in the night-time."

"That was the curious incident," remarked Sherlock Holmes

Sherlock Holmes makes an interesting point (in the short story "Silver Blaze") about the conspicuousness of zeros in the data. They can, as in this story, lead to interesting investigations and important conclusions.

However, in the busy world we all live in, they can often be overlooked, as they generally do not appear in most analyses, since they are by their nature non-events.

I was reminded of this in a conversation with Mark DiMaurizio regarding visual clickthrough analysis in email. Often, in looking at tabular or chart views of email clickthrough analysis, we will only see the links that were clicked on. If a link recorded zero clicks, whether due to a technical error such as a broken link, or because that call to action did not engage the audience at all, that is very valuable information that is being missed.

A very valuable and insightful question to ask whenever looking at any marketing data is "what is NOT being shown?", as insights based on what is absent from the data can often be as valuable as the insights the data itself shows.

As we look more and more to analyze our efforts in social media, and the efforts and forums that are driving interest in our solutions, we need to keep asking deeper questions of our data in order to draw out the insights from it.

How to change the server name which has SharePoint 2007 installed?

Introduction:

Everyone is using and liking SharePoint these days because it meets a lot of the requirements we have. It is a nice product from Microsoft, and we need to understand it and its administration completely. I have been working on SharePoint for a long time, and a few months back I tried to change the system name of the server on which SharePoint was installed and learnt a lot from it. But before discussing how, I want to say something about the situations in which you would rename the server.

  • We are developers and work on ASP.NET, SharePoint, Silverlight and so on. SharePoint itself is a very big system, and if you install the product directly on your development machine it installs a number of services. If you want to work on an ASP.NET application tomorrow, all of those SharePoint services will still be running by default, so whenever we don't want SharePoint running we have to go to the services console and stop them manually, which is not a good way of working. What I propose instead is to create a VHD containing SharePoint, SQL Server and the related software SharePoint needs; give the virtual machine some extra memory (around 2-3 GB of RAM) and it will run fast. We run the VHD only when SharePoint is needed. In our office we have plenty of SharePoint developers, and having everyone create their own VHD and install SharePoint is a very long and time-consuming process, so instead we create a single VHD and copy it to a share location. From there all the developers, admins and team members grab the VHD and work on it.

So far, did you find any problems? No, it was a simple process. The problem starts when we want to add the VHD to the network. We need to add each server to the domain so that everyone can access each other's SharePoint servers, but because we all copied the same VHD, every machine has the same name. We can't add servers with identical names to the domain without network conflicts, so finally we have to change the server name.

Steps:

Here we go: by following the steps below we can complete this operation.

  •    Changing the Alternate Access Mappings:
  1. Browse to SharePoint Central Administration, open the Operations tab from the top navigation and, under the Global Configuration section, select Alternate Access Mappings.

          [Figure: Alternate Access Mappings]

  2. On the Alternate Access Mappings page, change the mapping collection selector to show all collections, as shown in the figure below.

          [Figure: Show all mapping collections]

  3. Now you can see all the mappings (URLs) for the existing sites in the SharePoint system. This is where we change the site URLs so that they point to the new server name. Click the "Edit Public URLs" link in the sub navigation at the top of the page, select each site from the mapping collection on the right one by one, and map its URLs to the new server name as described below. For example, if your old server name is oldMOSSServer then the URLs would look like this:
     http://oldMOSSServer/, http://oldMOSSServer:8080/, http://oldMOSSServer:4534, etc.
     If the new server name is newMOSSServer, then the URLs need to be changed to:
     http://newMOSSServer/, http://newMOSSServer:8080/, http://newMOSSServer:4534, etc.

Note: Don't change any port numbers in the above operation; change the server name only.

  4. Repeat the above steps for all sites until every mapping points to the new server name.
  • Rename the SharePoint server by using STSADM tool
  1. Go to Start –> Run and type "cmd" to open the command prompt.
  2. Move to the path where the STSADM.exe file resides; usually it is in the 12 hive\BIN folder.
  3. The command we need to run is renameserver. The syntax is:

     stsadm -o renameserver -newservername "newservername" -oldservername "oldservername"

  4. Replace the quoted strings with the new server name and old server name respectively and run the command. Remember, this operation changes the server name only within the SharePoint system, not on the server itself; it simply replaces the old server name with the new one in the SharePoint databases, because in SharePoint everything is stored in the database.
  • Rename the original system name
  1. This is a simple operation most of us already know. Click the Start button, right-click Computer and select Properties.
  2. Click the Change button, type the new name for the server [and add it to the domain if it's not already a member], and save the changes.
  3. RESTART your server.
  • Change the System and SharePoint credentials
  1. Change the credentials of the SharePoint server by running the updatefarmcredentials command from the STSADM tool (the basic syntax is shown after this list).
  2. Check whether any IIS application pools still point to the old system name; if any do, change them and restart IIS.
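For reference, the basic updatefarmcredentials syntax looks roughly like this; DOMAIN\FarmAccount and newpassword are placeholders for your own farm account and password:

     stsadm -o updatefarmcredentials -userlogin "DOMAIN\FarmAccount" -password "newpassword"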

That’s it!!!

Do you think we are done with the process? NO, there is one last and final step, TESTING: open the SharePoint Central Administration site and check that everything is working fine!!!

Remember, if anything is broken or not working as expected, you can always revert the changes you made. If the Central Administration site does not open, follow the steps below to revert to the initial state.

1. Rename the server back to its old name [and remove it from the domain and add it to a workgroup if it was not in the domain before].

2. Rename the SharePoint server back to the old server name by running the STSADM renameserver command.

3. Change the Alternate Access Mappings back to the old server name.

4. Restart the server and check the changes.

5. Change the credentials, RESTART IIS.

6. You should be back with all settings as before.

Please post your comments here if you have any questions  or issues.

Note: A few months back I performed all of these steps successfully, but forgot to change the Alternate Access Mappings. The result: I was frustrated and felt the server rename operation had failed. After a lot of research and study we got it working. Hope this helps others.


Lead Scoring - Providing Disposition Options

Handing scored and qualified leads to sales in order for them to follow up is an inexact science. Continual optimization of the process is necessary in order to understand what aspects of a buyer’s digital body language are key to understanding buyer’s intentions. One of the best sources of this information is the sales team themselves. However, as any marketer knows, getting information out of a sales team can be challenging. On a topic such as the quality of an individual sales lead, it may in fact seem almost impossible.

Recently, we talked about the need to implement a claw-back system for scored leads that are passed to sales and show no sales activity. This approach is useful, but is a black and white system. Either a lead is good and is worked by sales, or it is not good and is ignored and quickly clawed back. There is no opportunity for middle ground, and no opportunity for feedback from sales.

If, instead, the sales team is presented with disposition options for the lead that feed directly into appropriate lead nurture campaigns, the best of both worlds is achieved. By having an option to pass a lead back to marketing, with a specific disposition that guides what will happen next, the sales person is able to maintain ownership of the lead. However, by carefully constructing the set of disposition options, marketing can learn much more about why the leads were not accepted by sales than they ever would have by asking sales to fill out a feedback form.

For example:
- If a lead is deemed by sales to be slightly too early in their buying process, they might enter the lead into a “Late Stage Buyer Nurturing” lead nurture program that provides case studies and ROI analysis to guide a prospect towards buying

- If a lead is deemed by sales to be too junior to make a buying decision, they might enter the lead into a “Convince Your Executive Team” nurture program that provides key information to make an internal business case for your solution

- If a lead is deemed by sales to be more interested in an alternate product (Product B), they might enter the lead into a “Product B Nurturing” program

These are only a few examples; each organization will have different options. The technique of providing these “lead disposition” options gives sales a “middle ground” option while at the same time providing rich insight to your marketing team as to why a lead is being rejected, which is extremely valuable.

This question is one of 8 critical lead scoring questions to consider when thinking about a lead scoring system.

Adding master page to a SharePoint web page

In SharePoint, when we add a new page from the browser by going to Site Actions and selecting Create Page, we choose a page layout and create the page. The page layout we select is always bound to a master page, so whenever you change the default or custom master page, the change is applied to all page layouts automatically. SharePoint never references a master page by a direct URL or name anywhere; it uses ~masterurl/custom.master for the custom master page and ~masterurl/default.master for the default master page.

But when we add a new ASPX page from SharePoint Designer, there is no option to select a page layout and inherit from it. We get a simple ASPX page with nothing on it by default other than the head and body tags. To attach a master page to this page we need to follow a few steps, which are described below.

  • Create the ASPX page in the Pages folder if your site uses a publishing site template; otherwise create the page in the root of the web site.
  • You can create the page by right-clicking the Pages folder or the root of the site and selecting New –> ASPX page. The empty ASPX page we just added will look as shown below.
  • [Figure: Empty ASPX page with no master page]
  • We need to perform the steps below to add a master page to the current page.

1. Add master page reference to the page.

2. Add the placeholders the page requires, depending on the master page.

1. Adding master page reference to the page.

  • Now we need to add the reference to the master page. We can do this by adding the tags below.
  • For publishing site template pages, we need to add two entries.
  1. Reference the master page:

    <%@ Reference VirtualPath="~masterurl/custom.master" %>

  2. Add the <%@ Page %> directive, which inherits from the publishing page class:

    <%@ Page language="C#" Inherits="Microsoft.SharePoint.Publishing.PublishingLayoutPage,Microsoft.SharePoint.Publishing,Version=12.0.0.0,Culture=neutral,PublicKeyToken=71e9bce111e9429c" meta:progid="SharePoint.WebPartPage.Document" %>

  • But for a team web site we only need to add the page directive, as follows:

    <%@ Page language="C#" MasterPageFile="~masterurl/default.master"    Inherits="Microsoft.SharePoint.WebPartPages.WebPartPage,Microsoft.SharePoint,Version=12.0.0.0,Culture=neutral,PublicKeyToken=71e9bce111e9429c" meta:progid="SharePoint.WebPartPage.Document" %>

2. Add the placeholders the page requires, depending on the master page.

  • This is a crucial and important step, because the placeholders are not the same for all site templates; they depend on the master page. If you customized the master page, then you need to add content controls for the placeholders you defined there.
  • Here are the main default placeholders we need to add to a page.

    <asp:Content ContentPlaceHolderId="PlaceHolderPageTitle" runat="server">
    </asp:Content>
    <asp:Content ContentPlaceHolderId="PlaceHolderPageTitleInTitleArea" runat="server">
    </asp:Content>
    <asp:Content ContentPlaceHolderId="PlaceHolderTitleAreaClass" runat="server">
    </asp:Content>
    <asp:Content ContentPlaceHolderId="PlaceHolderPageDescription" runat="server">

    </asp:Content>
    <asp:Content ContentPlaceHolderId="PlaceHolderBodyRightMargin" runat="server"> </asp:Content>

    <asp:Content runat="server" ContentPlaceHolderID="PlaceHolderMain"></asp:Content>

Placeholder “PlaceHolderPageTitle” is for setting the page title of the current page.

Placeholder “PlaceHolderMain” is the main and important placeholder where we keep all the content on the page. Place all the content of your web page in this placeholder.
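For example, a page that only needs a title and some body content could fill just those two placeholders as shown below; the title text and markup here are my own placeholders, not something required by any particular master page.

    <asp:Content ContentPlaceHolderId="PlaceHolderPageTitle" runat="server">
        My Designer-Created Page
    </asp:Content>
    <asp:Content ContentPlaceHolderId="PlaceHolderMain" runat="server">
        <h2>Welcome</h2>
        <p>All of the visible page content goes inside PlaceHolderMain.</p>
    </asp:Content>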

Note: This blog post only shows how to convert a plain ASPX page into a SharePoint web page by applying a master page to it. The placeholders and content shown may not match your master page 100%. If you have any problems applying any step, please leave a comment and I will look into it.


jQuery integration in SharePoint

As we know well from ASP.NET applications, jQuery is a client-side script library for implementing really impressive logic: calling server-side methods, animations, smooth rendering and so on.

SharePoint is a platform built on ASP.NET, so we can do almost everything in SharePoint that we implement in ASP.NET applications. Here is a small walkthrough of how to integrate jQuery into SharePoint applications. In ASP.NET we usually write a lot of jQuery logic to get data from the server with Ajax by calling page web methods and then render it using JTemplates and so on, but we can't implement the same in SharePoint because we can't write page web methods: SharePoint 2007 is built on ASP.NET 2.0, which doesn't support them (see the sketch at the end of this post for one common workaround). Other than that, you can implement all the same logic in SharePoint as well.

Follow the steps below to integrate jQuery into SharePoint.

  • Open your SharePoint site in SharePoint designer.
  • It's always better to keep your data and files well organized, so create a folder named "Scripts" for all scripts if one does not already exist.
  • Copy the jQuery script file into this folder. I am using the file jquery-1.3.1.js.
  • Create an ASPX page in the Pages folder of the site if it uses a publishing site template; otherwise create the page in the root of the site. [It works from any location, however.]
  • This page is not a web part page; we are just creating a simple ASPX page to test the jQuery integration.
  • Add a reference to the jQuery JavaScript file in the head tag of the page.
  • Add the code below to the body of the page to test the jQuery functionality.
  • <script type="text/javascript">
        $(document).ready(function() {
            $("#cb").live('click', function() {
            $("#lblMessage").text("you clicked on CheckBox, selected = " + $("#cb").attr('checked'));
            });
        });
    </script>

    <input type="checkbox" id="cb" />
    <label id="lblMessage"></label>

  • We just wrote a very small code snippet to test the jQuery functionality; this post's main goal is to integrate jQuery with SharePoint. In the same way, you can add the reference to the master page of the site to take advantage of jQuery on all pages of the web site.

  • We can apply the master page to the current ASPX page by following this post. This will give you the same look and feel as other pages.

  • There are plenty of ways to do this. For simple integration I explained placing the jQuery file in a Scripts folder at the root of the web site, but a better approach is to place the file in the LAYOUTS folder of the 12 hive. That way you can access the file from any site and on any page throughout SharePoint, because _layouts is a common path available to all sites.

  • Adding script reference to all the pages in a site:

  1. Add the script reference to the <HEAD> tag of the site's master page, so that all pages have the reference to the jQuery script and you can use it anywhere.
  2. Syntax: <script type="text/javascript" src="/_layouts/scripts/jquery-1.3.1.js"></script>
  • Add script reference to specific pages:
  1. For this we have a handy web part for adding HTML/script/CSS: the Content Editor web part. Add a Content Editor web part to the page [most likely at the top of the page] and put the script reference code in it. jQuery will then be available only on the pages where you added the reference.
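  • Finally, as mentioned at the start of this post, page web methods are not available in SharePoint 2007, so one common workaround is to call SharePoint's built-in web services (such as Lists.asmx) directly from jQuery. The snippet below is only a rough sketch of that idea; "Links" is a placeholder list name and lblMessage is the label from the test snippet above.

    <script type="text/javascript">
        $(document).ready(function() {
            // Build a SOAP request for the GetListItems operation of Lists.asmx.
            var soapEnv =
                "<soap:Envelope xmlns:soap='http://schemas.xmlsoap.org/soap/envelope/'>" +
                  "<soap:Body>" +
                    "<GetListItems xmlns='http://schemas.microsoft.com/sharepoint/soap/'>" +
                      "<listName>Links</listName>" +
                    "</GetListItems>" +
                  "</soap:Body>" +
                "</soap:Envelope>";

            $.ajax({
                url: "/_vti_bin/lists.asmx",
                type: "POST",
                dataType: "xml",
                data: soapEnv,
                contentType: "text/xml; charset=\"utf-8\"",
                beforeSend: function(xhr) {
                    xhr.setRequestHeader("SOAPAction",
                        "http://schemas.microsoft.com/sharepoint/soap/GetListItems");
                },
                success: function(xData) {
                    // Each returned item is a z:row element; show its Title field.
                    $(xData).find("z\\:row").each(function() {
                        $("#lblMessage").append($(this).attr("ows_Title") + "<br/>");
                    });
                }
            });
        });
    </script>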

Unsubscribes and Content Relevance in B2B Marketing

Another great chart from MarketingSherpa shows very clearly what we as marketers have long known: relevance is key. 58% of those who stop reading, disengage, or unsubscribe cite a lack of relevance as a key factor.

Too many people are still looking at unsubscribe rates as a relevant metric to determine whether marketing messages are connecting with an audience. The fact is that only some of your audience will unsubscribe. The rest will tune out, emotionally unsubscribe, or even report your message as spam if it loses relevance.

So what is relevance, and what can we do as marketers to better align our communications with what is relevant to the audience? There are four main areas we need to focus on in order to connect with our buying audience:

Relevance to their Business: This is one of the more often focused on aspects of relevance, and in many discussions around segmentation, this is all that is considered. Industry information can tell us whether they are likely to be experiencing pains we can solve, company size will give an indication of the resources they may have to tackle that pain and the size of a challenge it might be for them. Geography can give us an indication of whether the cultural or regulatory environment makes the business pain more (or less) acute. To do this, we first need to get our marketing data cleansed continually so that we can easily define our target segments based on industry or geography.

Relevance to their Role: Now it gets interesting. Knowing what role a buyer plays in the buying process allows you to target your message much more accurately. Are they a technical evaluator? If so, product details, devoid of marketing speak may be best. Are they an economic buyer? Perhaps ROI oriented case studies might be best.

Relevance to their Stage in their Buying Process: We’ve all received marketing communications that were driving towards a deal when we were just educating ourselves on the industry, and vice versa, we’ve received introductory, high level content when we were almost finished with a detailed evaluation. The mismatch is painful as the content is not relevant even though we do have a certain amount of interest. Matching the stage in a buyer’s buying process is crucial to relevance, and to do this, we need to map the buying process and what aspects of Digital Body Language indicate a buyer is at each stage.

Relevance of Style: Each audience responds to different styles and content. Where should the call to action be? What copy or subject line works best? There is no better answer to this than actual prospect response, and the use of A/B testing is your best option to understand which style is most relevant and effective with your audience.

Keeping unsubscribe rates low is great, but keeping audience engagement high, and emotional unsubscribes low is even better. The only way to accomplish this is through a relentless focus on making your message relevant to your audience across each of the key dimensions.

Lead Scoring - the Importance of Clawbacks

Much of the discussion around lead scoring is focused on the passing of the lead in one direction; from marketing to sales. However, equally important when scoring leads is to consider what happens when a lead is passed to sales and there is no appropriate follow-up from sales. Assuming that a good job has been done on understanding the buyer and his or her buying process, the lead that was passed to sales can be assumed to be a good lead.

If sales does not follow up with the lead in an appropriate amount of time, there is a good chance that your opportunity for connecting with that lead will disappear or diminish. When this happens, you want to ensure that the lead does not slip through the cracks to become a dead lead. Significant investment has been made in getting the lead to that point, and losing that investment through inattention is an undesirable outcome.
One of the best options is to implement a “clawback” system with sales whereby a lead is clawed back from them if they do not act on the lead within a defined amount of time. Defining this “service level agreement” with sales is a topic for another day, but a key tenet of it is the idea that if sales does not begin to work with a lead within a certain amount of time (2 or 3 days is ideal), marketing will pull the lead back from them.

Activity can be defined in a number of ways. You will want to automate this process, and your leads will likely be in your CRM system by this point, so the two most accessible options for defining activity are:

a) Sales activity logged against the lead record in the CRM system
b) A change in the status of the lead or opportunity

If activity is not seen in the predefined amount of time, the lead should be clawed back. When it is clawed back, one of four things can happen to the lead:

1) The lead can be passed to another sales rep in the field
2) The lead can be added to a lead pool with a “first-come-first-served” system of allocation whereby the sales team can compete for it
3) The lead can be passed to a partner channel
4) The lead can be re-added to a nurture program until indications of buying interest resurface at a later date

The addition of a claw-back process after you have scored and handed off your leads to sales provides an encouragement to sales to work the leads quickly. However, this overall process relies on the leads being well scored and ready to purchase in order for the sales team to remain motivated to follow up. Furthermore, it relies on there being an agreement between sales and marketing as to what a qualified lead should look like, what lead score defines a lead as being sales-ready, and what timeframe sales has for follow-up.

With those criteria in place, the hand-off between marketing and sales becomes significantly more efficient due to the competitive dynamics of the overall system.

This question is one of 8 critical lead scoring questions to consider when thinking about a lead scoring system.

Pigs Flew, Hell Froze Over and the Eagles Got Back Together

MAJOR NEWS ANNOUNCEMENT!

Madeline is coming to live with us! Yes, this is for real. Yes, I’m as shocked as you are. Yes, this was instigated by Cruella. Yes, I’m excited, but also very nervous.

If your mouth is closed now and you are no longer staring wide eyed and have been able to process this you may have now realized how incredibly CRAZY this is. A total complete 180 degree switcharoo from a few months ago when I was incapable of even FLYING with the children. Granted this isn’t about me…it’s about Maddie and her father, but in making that decision I’m sure the parenting ability of yours truly was under great scrutiny?

So, we’ve already started the process, submitting the paperwork and working on our end for the schools, doctors etc….

CAN YOU BELIEVE THIS????

So, while I just have a few minutes to type as we are on a one hour wireless internet option and John still needs to access his facebook…I’ll type more later once we have more details.

The running journal from Germany

April 28
We arrived in Germany today. After our near month long excursion from Arizona to Los Angeles to Portland then back to Los Angeles – stopping in Chicago, staying a weekend in Washington DC and then back to Chicago before an overnight trip to Frankfurt.

The plane flies over a lush green with typical red roofs that I remember so well from my years in Italy. I peer over sleeping Bella and out the window to watch the passing landscape and feel a giddiness creep over me that puts a giant grin on my face.

The plane trip was uneventful. Bella was a champ. She fell asleep about an hour into the flight and slept the entire way. John and I, on the other hand…were exhausted by the time the plane set down. We’d both managed one or two hours of restless sleep and the thought that we still had a 4 hour bus ride to Vilseck just about made me cry.

We landed…baby girl was awake now and so happy to be up and around. We let everyone else deplane before trying to unload our piles of stuff. We knew we still had to get through customs so we had some time to wait anyhow. It was surprisingly easy. Passports – Military ID….done with customs. Not even a question. On the other side we found the Army guy…attaché…liaison…whatever you want to call him. He had our name and took us over to the Sheraton where the Army has a sort of waiting room for incoming soldiers. We asked about the bus and he told us that families weren’t sent on the afternoon bus…only the morning bus so we’d be spending the night in Frankfurt. They set us up in a room and it was perfect timing. Bella was ready for her morning nap and we were ready to sleep too. We all crashed around 1PM and slept until about 5 PM. Still woozy, we decided it was more important to find something to eat and besides….Bella was ready for a few hours of wakeful playing. We headed to a nearby grocery store….Oh salami! Oh delicious bread! Oh marvelous cheeses! Oh yogurt of splendid smooth texture with hazelnuts! Oh cheap chocolate that makes Hersheys taste like wax! It was a decadent first taste of Europe.

Interrupting my narration here, I have to say that I was quite nervous about my first experience with a “real German-speaking person.” Having my phrase book packed in my shipped baggage had made it near impossible for me to “brush up” on my German vocabulary. That combined with the lack of sleep made it impossible for me to understand a word they were saying. So, I left it to John, who with his masterful charm smiled his way through our first Euro transaction.

So here we are….exhausted and its only 8PM. Luckily we’re ALL exhausted….Daddy, Mama and baby girl….g’night.

ETA: We all slept until 7AM the next morning. Eleven hours….we were exhausted!

April 29
We’re in Vilseck! John got to be the bus commander. He was in charge of making sure that all bus riders made it to Vilseck. It was a large chartered bus (think greyhound)…there were four people on it….thats including the baby. Ha ha ha. Aside from us there was a medic named Michelle. She was nice. Guess where the bus driver stopped for lunch? Yep….McDONALDS. Ugh. I suppose that’s safe for them….how do they know if we are adventurous types that don’t really want a big mac to eat, but prefer a donnorkebab or a salami sandwich?

OUR SPONSOR
We met Charlie – the guy that John is replacing here. He seems nice…drove us all over and carted our baggage to guest housing for us. He is our sponsor…which means he’s supposed to help us get acquainted with our surroundings.


ROSE BARRACKS…
Its beautiful. I’m so reminded of Oregon. Its green with lots of trees and grass…so different than Sierra Vista and Fort Huachuca. I’m thrilled that I get to enjoy flowers and trees and have a vegetable garden…..(deep intake)…dare I hope?

LIVING QUARTERS…
Did I mention that we’re in overflow guest housing? Lets just say that is about as bad as it can get for longer term housing…the overflow part to be sure. We walked into our little one-bedroom barrack and found a bed in the living room, appliances that don’t work, two refrigerators and no pots or pans…it kind of reminded me of a rarely used storage room. But, we’ve made do…the extra appliance is now an end table, the living room bed is actually more comfortable than the couch to read or write or play with Bella and I AM glad we have a full size fridge.

There is no room for a borrowed port a crib so Bella is sleeping (quite soundly) in a suitcase. She fits perfectly and I’ve removed the extra bedspread from the living room bed to use as padding under her. It quite conveniently slides under the bed after she wakes up each morning and I put her in the bed to nap during the day.

So, my first day in Germany and I still feel quite like I’m still in America. It’s a strange feeling…

May 1
Its Friday! Woohoo! I’m not sure why I’m glad for that….its not like John is back to work yet, but for some reason a Friday always makes me feel like a relaxing weekend is just around the corner.

We took the local Post to Post bus to the larger Army base about 25 miles away. Its called Graefenwohr (Graph-en-veer)…or just Graef to us locals. (ha ha ha) We wanted to check out the PX (the big store that only army peeps get to use) and since they don’t have one in Vilseck at the Rose Barracks we decided to check it out.

It was a bad day for travelling. Well, it started out good…we took the “slow” bus so we could wind our way through all the little villages. That was really interesting to see the houses and the small towns and the bike paths. We made lots of plans for fun outings in the future. When we got to Graef, they scanned our ID cards and John’s wouldn’t register. Sooooo….we all get off the bus at the gate and John has to sign in as a visitor…then finds out that the PX is about a 2 mile walk. Good thing I wore tennis shoes…and we brought the baby carrier.

We walked to the PX…walked around it….didn’t find much different from the one at Ft. Huachuca…I was thirsty, baby girl was grumpy, John was grumpy. Ugh.

Wait another 45 minutes for the “fast” bus only to find out it’s the “slow” bus again. Driver informs us that they changed the route schedule in April….we’d been given the OLD schedule….

Long drive back to Vilseck where John’s ID still doesn’t register, but they allow us on post anyway.

Glad to be back in our little “home.”

May 6
ARMINESS *^&%$^^*^&*%$^
Did I say all that with my outside voice? OK…so we were really looking forward to living in a German town and experiencing local “flavor.” Little did we know that the devious workings of Army housing would keep us from our dreams. When we went to the housing office John was given two sets of keys to two houses in a town called Auerbach. On the map it seems that its just down the road. We called Ed, our new friend, to drive us out there.

Ed’s been acting like our sponsor. Charlie has unfortunately turned out to be a real dud in that department and Ed’s been a lifesaver for helping out.

Anywho….Ed starts off driving and we keep driving and driving and driving….and 40 minutes later we get to Auerbach….you’ve GOT to be kidding me. That was 40 minutes in a car – driving at top speed in the daytime with no traffic. I am NOT excited about that commute.

The houses are interesting…They are located in the “American housing section” of the town….so the whole neighborhood is military. Back to the houses…both three stories with a beautiful view of farmland…no yard to speak of, however, and some trash-accumulating neighbors. Such a turnoff from what I was hoping for from a small town existence. We just weren’t impressed.

So we go back to see Ed’s house. Its BEAUTIFUL. He lives in Sorghof in the “new housing” built by the Army. I’m all for trying to get a house there. We head back to the housing office and are told that’s not an option. We can either A) live in one of the two houses in Auerbach or B) live on post. We decide to take a look at the post housing.

So, for the pros and cons we came up with the following:
Post Housing Pros – John can come home just about any time he wants during the day, conveniences of the bus, commissary, classes, childcare, etc.
Post Housing Cons – its on POST.

Auerbach Housing Pros – beautiful German field as back of house view, typical German house with peaked roof, garage,
Auerbach Housing Cons – trashy neighbors, no yard, 40 minute drive one way, bus only available 2X per day

Today was the day…we had to make a decision. We decided that we were adventurous people and we can live on post and still have a good overseas living experience. So, John tells them we want the on post housing. Guess what….they tell us that the lady helping us made a mistake. She’s only allowed to offer ONE house to us. The only house available to us is the smaller Auerbach house.

We don’t want it.

Then we won’t pay for your temporary housing.

Oh, you mean that storage room you call temporary housing in the overflow barracks? Why not?

If there is something available on the list now you HAVE to take it or the Army considers you as having “declined” housing.

But we didn’t decline housing. We decline A house…we want the one on post that’s available.

So, on continues this conversation to where they finally agree to give us on post housing, but WE HAVE TO PAY FOR OUR OVERFLOW APARTMENT - $50/NIGHT. Yes, that’s in all caps because when I think about it my blood boils and I have to yell it. John makes a deal with them that they pay up until the Auerbach house could be livable…ie when temporary furniture could be delivered. So, we’re out the cost of about a weeks worth of housing in this overflow storage room because they offered us three houses and we wouldn’t give up and just take the crappiest one of the three and we STILL end up on post and without the cool German house we were hoping for. And thus you can understand my *&^$#%^ ARMINESS remark at the beginning.

Oh…and we can’t just take a house on the economy because if we decline a house the military can offer they won’t give us the housing allowance to help pay for the house on the economy….

I’m at peace with the idea of living on post….and am lodging a formal complaint with the Inspector General to try and get our weeks worth of temporary lodging paid for. (Can you see the stubborn side of me coming through?) Lets just say this story is NOT over yet.
May 7
Happy Birthday daddy-o! My dear old dad turns 64 today and all he got from me was an email and one of my favorite memories of him posted on this blog. (He knows I’m a little strapped for sending B-day cards/gifts right now so I think he forgives me…)

Ahem….one of my favorite memories of my dear dad is his storytelling abilities. He’s skilled with making up tall tales and reciting them in the most fascinating of ways. But one of our favorites as kids was when dad would tell us that “we’re going on a bear hunt….and I’m not afraid.” Then he’d ask each of us if we were afraid and we’d all shake our heads NO while inside there was just a twinge of fear and excitement for the unknown of where this bear hunt was going to take us. It didn’t matter that we’d gone on this bear hunt numerous times over the last few years…it was still just as exciting as it was the first time I went on the bear hunt. I look forward to the first time Bella gets to have Grandpa Johnnie take her on a bear hunt….I may just go with them. I’m not afraid!

PS…its also Garretts birthday….he’s 8 this year! Happy Birthday Garrett!

May 7
Today I attended the Nursing Mothers/New Parent Support Group. I decided I needed to get out and meet people now that I’ve been here a full week AND I know where I’m going to live. So, I get the information and head to the Chapel Annex for the class. It’s a good 20 minute walk from where we are at. I walk into an annex and it was more like a short hall with storage closets on each side. Hmmmmm….this can’t be right…so I continued to the main chapel and finally found the Sergeant in charge who told me there was no class like that at the chapel - *sigh* So I head back to ACS (Army Community Services) and what do you know…the class is right there. DUH.

It was interesting…there were 6 people in there…a pregnant couple and then a bunch of moms with babies. I can’t remember all their names, but three of them are almost exactly Bella’s age! I love it. One little girl born on Nov. 11th, a boy born November 18th and another little girl born December 16th. What fun for her to have so many friends so close in age…of course, by the time they actually get to “playing” status they may have transferred out of Germany, but until then I enjoyed talking about where each child was currently at developmentally and getting tips on transitioning to solid foods!

May 9
Volksfest! My first German-American Volksfest…which is really just slang for “crappy American carnival with better food.” We walked into Vilseck and there were a few carnival rides and game playing booths, a historical walk with a bunch of old-time military (German and American) trucks and tanks and a Bradley (John’s favorite military vehicle) AND the infamous beer tent of Germany.

It was pretty fun to walk around and just watch people. I had my first donnor (basically a pita with meat and veggies and a yummy sauce) and as we sat eating I watched a little German baby boy about 18 months old dipping his fingers in his mom’s beer and then would suck the beer off his fingers before dipping them in again. His mom didn’t notice at first, but when she did she actually moved the beer glass so it was EASIER for the child to get at it with his hand. I guess I’m not in Kansas anymore, am I?

Speaking of Kansas – Vilseck is an adorable town. See...?

May 13
Happy 6 month birthday to my sweet baby girl! Can you believe she is already 6 months old? Me either.

She still stands like a champ, but refuses to learn to sit. Its like her body doesn’t know how to bend at the waist. She automatically straightens up stiff as a board. It makes me laugh. She is, however, getting ready to crawl. Up on her little hands and knees and she rocks and then pushes herself backwards. This is often while she is hollering bloody murder because she hates to be on her tummy for too long.

She is a babbler…can’t keep quiet for anything. I think she gets that from her dad, for whom I often have to insert a “cut the string, chatty cathy”…especially during movies.

She is starting on solid foods. So far she loves Gerber Biter Biscuits (which, darn it all, I haven’t been able to find here.) and she also loves bananas. I’m going to start her on peas soon then some other foods like yams, etc. It’s a little hard when you have practically no kitchen utensils to get that puree down, but we’re working on it. And in the meantime…she still loves her milk.

Just on a mushy side note…who ever thought I could love a little girl so much? She really is the most fantastic miracle and I don’t know what I’d do without her!

May 13
Phone call late tonight – this post deserves its very own section…title….so look for Pigs flew, Hell froze over and the Eagles got back together…

The Medium is the Message: B2B Marketing, Social Media, and Conversation Context

Ever since Marshall McLuhan published his 1964 book Understanding Media: The Extensions of Man, we have been familiar with the phrase "The medium is the message". The medium in which a message is delivered has as much of an effect as the content delivered in that medium.

Today's world of social media needs to be viewed in a similar light. The medium in which a message is delivered is as important to consider as the content of the message itself. As today's marketing organizations consider how to engage in social media forums, not considering the context of the medium being used can lead to a significant change in the overall effect on the message being received.

Social media sites are best compared to social functions in order to understand this effect of context. Like social functions, each social media context has a certain "vibe", which guides the conversation that is accepted.

If you think of this in light of social functions, it becomes clear that certain conversations are appropriate in certain social contexts, and not others. Business meetings are intended to be direct, on topic, and have an agenda-driven vibe. On the other end of the spectrum, parties have a social vibe and only social conversations are generally seen as accepted.

The use of social media for business shares a similar dynamic. The context guides the vibe, which guides the conversations that are accepted. Content of conversations cannot be understood separate from the context of the medium in which they are held.

I wrote recently about our Facebook B2B marketing strategy, which leveraged the context of the Facebook medium to guide the message being delivered. Facebook carries a very social vibe, and our goal with our Facebook strategy is to keep our conversations relevant to that context.

In the context of LinkedIn discussions as a medium, however, "talking shop" is much more expected and anticipated, so the messaging in that medium can be much more business oriented.

If a person at a party brings up a business conversation, or a person in a business meeting tells a party story, the audience is as much influenced by the out-of-context feeling, based on the medium, as they are by the content itself. It is the same in social media. A whitepaper on best practices may contain excellent content, but it is the context of the medium as much as the content itself that influences the audience.

When exploring B2B marketing in social media, understanding this context is critical to understanding how your message is received. McLuhan's prescient observation that the medium is the message is as accurate today as it was years ago.

Cherry Picking of Leads: B2B Marketing to Sales Handoff

Should we allow sales to cherry pick leads that, based on lead scoring, we have deemed not to be ready for sales?

Steve Kellogg at Astadia raised the question very aptly in his Endless Lead Loop post, and it's a question we all face as we wrestle with the business process of lead scoring and handing leads from marketing to sales. Let me start by saying that there is no right answer here, and businesses that have consciously decided to allow cherry picking are not necessarily doing anything wrong.

However, I would make a strong argument for "No."

The better we get at lead scoring, the more factors we are able to consider. We look at multiple dimensions of lead scoring to split the "who" from the "how interested", we look at multiple components of a score and allow each component to only contribute a maximum amount, and we take time into account by degrading lead scores over time. Over time, as we work with sales, we are able to build a fairly accurate picture of what matters to them in a lead.

However, there will always come a time when sales is not getting, in their view, enough volume of leads, and they will ask to open up the funnel so they can "cherry pick" the leads that they deem good. Sounds harmless, as some might turn into opportunities, and those that don't can continue to be nurtured.

It is, unfortunately, not a harmless activity. If we are connecting sales with buyers who are too early in their buying process to be ready to talk to sales, we run a very real risk of alienating those buyers and pushing them away. Despite our good intentions, this cherry picking activity can have significant negative consequences, as prospective buyers who might be good opportunities later can disconnect from an otherwise promising education process early in their buying cycle.

Better than allowing cherry picking is to keep the same scoring methodology but open the funnel slightly. If an A-Lead is passed to sales, and 80-100 points is deemed to be an A-Lead, then keep the same process in place, but open the funnel up so that an A-Lead is now from 60-100 points. By doing this, we prevent sales from negatively impacting early-stage prospective buyers, but still allow them more leads in the funnel.

This question is one of 8 critical lead scoring questions to consider when thinking about a lead scoring system.

File system path for Adobe ConnectNow in Windows Vista

Hi guys,
Today I had a demo with one of my clients to show the work I did. One of my colleagues tried to connect to Acrobat ConnectNow from his system [I was busy with some other work at that time], and it would not open for him because of a version mismatch or missing files on his machine. He is using the Windows Server 2003 OS, where the ConnectNow files are stored under C:\...\username\Application Data\Macromedia\Flash Player\www.macromedia.com\bin. I am using Windows Vista, where ConnectNow works fine, so my idea was to send him the files I have. I got the above path from him, but when I searched for the files in that location on Vista they were not there, because the file system path in Vista is different. After spending about seven minutes on it, I found the location below, where all the ConnectNow files live.
I sent him the files I have and it works great. Maybe it will help others too....
This is the path for it in Vista....
C:\Users\UserName\AppData\Roaming\Macromedia\Flash Player\www.macromedia.com\bin\acaddin Read More...

Detecting Buyer Roles in B2B Marketing

Understanding what role a buyer plays in the buying group is critical to effective B2B marketing. With a good understanding of what role each person plays, we are better able to tailor our messaging and our nurture marketing.

Each buying process is unique, and much as we can map the stages of the buying process for our products or services, we can also map the key roles in the buying process. These roles are likely to be very similar to the roles that your sales team has mapped out, just that they may be earlier in their investigation than when sales typically engages.

For each potential role in the buying group, we can define what the typical digital body language of a person in that role might look like. The specifics of this will of course depend on your buyers and what you have available on your web properties.

A generic set of buying roles might look like:

Economic Buyer/Decision Maker: This person is the gatekeeper to the budget and evaluates projects from an ROI perspective. Look at digital body language for viewing or searching for ROI focused case studies and calculators, vendor viability information such as investors or management team, and risk mitigation factors such as warranties.

Technical Evaluator: This buyer brings specific technical expertise to the buying situation, and evaluates projects on their technical merits and viability. Look for digital body language that indicates deeply technical investigation; product specifications, precise searches for highly technical information on your solutions, and activity on technical discussions and blogs.

User Buyer: The user buyer represents the users of your product or service. They are looking to understand its effect on their day to day lives, and as such may be seen looking at trials, demos, user documentation, or support sites in order to understand how they will operationally use the products.

Influencer/Coach: This participant is somewhat involved in the buying process and/or highly supportive of your efforts. Look for activities that suggest internal promotion of your ideas, such as frequently forwarding content internally, referring key internal stakeholders to your material, and searching for material that would bolster internal support for your offering.

Much as we looked at mapping web assets to stages of the buying process, we can also map assets to buying roles, and by doing so gain a better understanding of what role each buyer plays. With this, we can make our marketing efforts much more precisely targeted and effective. Read More...

Use of the XMLHttpRequest object to make server calls through JavaScript

We had a requirement to show a light box when someone clicks on a search result on the search results page. When I was new to ASP.NET and knew only a little about ASP.NET AJAX, I thought of using it, but it didn't work out well: the search returns a large amount of data, and if I put everything in an UpdatePanel it can't process the request efficiently, because ASP.NET AJAX is meant for sending small amounts of data to and from the server.

The main requirement is that the search results page shows a title, a small description, and a read more link; when anyone clicks the read more link, it makes a server call, gets the data related to that search result, and shows it on the page in a light box. That needs an AJAX call, and ASP.NET AJAX won't work well here. After thinking about it for a couple of days I got a new idea, implemented it, and it works great. Everything was implemented with about 15 lines of JavaScript using the XMLHttpRequest object.

How it works:
I created an ASPX page that returns the HTML I need to show in the light box. From the XMLHttpRequest object, I call that page, get the response from it, and show it in the light box. Very simple!!!
You can use handlers to do this as well: inside an HttpHandler we can write logic that builds the HTML depending on the query string parameters and returns that HTML to the browser (a rough sketch of that approach follows). But for better styling and HTML formatting I used an ASPX page instead of a handler. Finally, we catch the response and bind it to the page.
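For reference only, here is a hedged sketch of such a handler; the class name, query string parameter, and markup are hypothetical, and the handler would still need to be registered under httpHandlers in web.config:

// Hedged sketch of the HttpHandler alternative mentioned above; names and markup are hypothetical.
public class SearchDetailHandler : System.Web.IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(System.Web.HttpContext context)
    {
        // Build the light box HTML based on a query string parameter.
        string id = context.Request.QueryString["id"];
        context.Response.ContentType = "text/html";
        context.Response.Write("<div class=\"result-detail\">Details for item " +
            System.Web.HttpUtility.HtmlEncode(id) + "</div>");
    }
}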

ASPX CODE:
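The original markup was not preserved in this post, so below is a minimal sketch of what it assumes: a DataList (dlData) with a Literal (litTitle) in its ItemTemplate for the search results, and a Panel (panelLightBox) that acts as the light box container. Only the control names come from the code-behind below; everything else is illustrative.

<%-- Minimal illustrative sketch; only the control names are taken from the code-behind below. --%>
<asp:DataList ID="dlData" runat="server" OnItemDataBound="dlData_DataBound">
    <ItemTemplate>
        <asp:Literal ID="litTitle" runat="server" />
    </ItemTemplate>
</asp:DataList>

<%-- Rendered as a div; the loadurl/triggered functions inject the server response into it. --%>
<asp:Panel ID="panelLightBox" runat="server" Style="display:none; position:absolute;" />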
Explanation:
  1. I have placed a DataList on my page, which binds all the search results that match the given keywords.
  2. I am using the ItemDataBound event to bind the data on the server side; you can check that in the ASPX.CS section.
  3. I am using an ASP.NET Panel to hold the response from the server and show it as the light box.
ASPX.CS:
protected void dlData_DataBound(object sender, DataListItemEventArgs e)
{
    // Skip header and footer rows; only item rows carry search result data.
    if (e.Item.ItemType == ListItemType.Header || e.Item.ItemType == ListItemType.Footer) return;

    Literal litTitle = e.Item.FindControl("litTitle") as Literal;

    // {0} = URL of the page that returns the light box HTML, {1} = anchor text, {2} = client ID of the light box panel.
    string anchorText = "<a href=\"javascript:void(0);\" onclick=\"loadurl('{0}','{2}');return false;\">{1}</a>";
    litTitle.Text = String.Format(anchorText, "Path of the page", "Title", panelLightBox.ClientID);
}
Note: "Path of the page" is the URL of the page we need to call, and "Title" is the anchor text.


Explanation:
In this event, you can find the controls declared in the DataList's ItemTemplate and bind data to them. For example purposes, I am binding data only to the litTitle control.

Here, if you observe, I am building an HTML anchor tag and assigning it to the Literal control. That's not the cleanest way; you could instead put an HTML anchor with runat="server" in the ItemTemplate and bind the data to it, either works. I am using the onclick event of that anchor to make the call to the server: it calls a JavaScript function named "loadurl", which sends the request through the XMLHttpRequest object.
So, the process is,
  1. When you click on the title, it calls the loadurl JavaScript function.
  2. In loadurl, we create an XMLHttpRequest object and send the request to the server.
  3. When the response comes back, we take its responseText and bind it to the light box control.
You can check the loadurl function below.

JAVASCRIPT:
var xmlhttp;

// Makes an asynchronous GET request to 'dest' and shows the response in the element with id 'parentID'.
function loadurl(dest, parentID) {
    try {
        // Use the native XMLHttpRequest where available; fall back to ActiveX for old IE.
        xmlhttp = window.XMLHttpRequest ? new XMLHttpRequest() : new ActiveXObject("Microsoft.XMLHTTP");
    } catch (e) {
        return; // No AJAX support in this browser.
    }
    xmlhttp.onreadystatechange = function () { triggered(parentID); };
    xmlhttp.open("GET", dest, true);
    xmlhttp.send('');
}
var mainLightBoxDiv = null;

// Runs on every readyState change; once the response is complete, binds it into the light box.
function triggered(parentID) {
    mainLightBoxDiv = parentID;
    if ((xmlhttp.readyState == 4) && (xmlhttp.status == 200)) {
        var div = document.getElementById(parentID);
        div.innerHTML = xmlhttp.responseText.toString();

        // Work out how far the page is scrolled so the light box appears near the top of the viewport.
        var yScroll = 0;
        if (self.pageYOffset) {
            yScroll = self.pageYOffset;
        } else if (document.documentElement && document.documentElement.scrollTop) { // Explorer 6 Strict
            yScroll = document.documentElement.scrollTop;
        } else if (document.body) { // all other Explorers
            yScroll = document.body.scrollTop;
        }
        var centerY = yScroll + 170; // 170 = minimum height of the light box in this scenario
        div.style.top = centerY + 'px';
        div.style.display = "block";
    }
}

function CloseDiv() {
    document.getElementById(mainLightBoxDiv).style.display = 'none';
}
Explanation:
  1. We use two parameters: one for the destination URL, and one (parentID) that holds the id of the light box control.
  2. We create an XMLHttpRequest object.
  3. When the ready state changes, we trigger a function to process the response.
  4. We make a GET request to get the data from the server.
  5. We bind the responseText from the response to the parentID element's innerHTML.
  6. Some logic works out the vertical position where we need to show the light box: we read the scrollbar position and set the top of the division, adding 170, which is the minimum height of the light box in my scenario.
  7. We use another function to close the light box.
These days a lot of new libraries are coming out, and a better way to implement the above case is with jQuery. We can use jTemplates to bind the search results data and make an AJAX call to a page web method to get what we want. This is very simple and works well; a rough sketch of that approach is below.
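Just to illustrate the idea (this is my own hedged sketch, not code from the original implementation), a jQuery call to a hypothetical static page web method named GetResultHtml could look like this, binding the returned HTML into the light box element:

// Hedged sketch: assumes a [WebMethod] named GetResultHtml on SearchResults.aspx; all names are hypothetical.
function loadResult(itemId, parentID) {
    $.ajax({
        type: "POST",
        url: "SearchResults.aspx/GetResultHtml",
        data: JSON.stringify({ id: itemId }),
        contentType: "application/json; charset=utf-8",
        dataType: "json",
        success: function (response) {
            // ASP.NET wraps page method results in a "d" property.
            $("#" + parentID).html(response.d).show();
        }
    });
}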

To see it live or to test it, you can check the page I developed for one of our clients.

Very simple!!! Happy coding. Read More...

D&B: Digital Body Language Throughout Customer Lifecycle

A lot of our conversation focuses on new customer marketing, but for many or even most businesses, the successful retention of their existing customers is equally critical. I had a good conversation with Jeff Yee at D&B Canada while writing Digital Body Language, and this was his core focus. By leveraging insights into interest levels, usage profiles, and adoption, he was able to optimize their subscription renewal efforts. Enjoy the case study:

D&B (Dun & Bradstreet) Canada, given their leadership position in business and credit information, wanted to focus on end-of-year renewals for their customer base. With significant existing market share, renewal and retention were just as important to them as new customer acquisition.

The initial project they undertook to achieve this goal was focused directly on the renewal period. A progression of emails, triggered by an upcoming renewal date, was sent to the customer at 90, 60, and 30 days prior to renewal. Over the progression of communications, the tone would become increasingly clear in order to encourage the renewal process.

Watching the results, the D&B team noticed two trends. The first was that customer response to the emails increased as the renewal date approached. Early communications had open rates of 33%, but as the renewal date approached these rates would jump to 40%, showing a significant uptick in interest.

The second trend noticed was a correlation between the renewal interest and direct usage of the D&B service. Adoption was a critical driver of renewal regardless of product line or industry. To continue to grow their adoption rates, D&B then turned to understanding the digital body language of their customers as expressed in their usage of the service. The service was well instrumented, and provided excellent marketing insight into overall usage and feature specific usage patterns for each user.

Conceptually, the team split the customer's first 12 months into three phases: adoption, usage, and renewal. With the renewal phase now fully automated, the marketing team is focused on the other two. Leveraging their understanding of the customer's actual system usage, combined with their insight into the relationship between usage and renewal, a series of onboarding communications will ensure every new customer is quickly and seamlessly able to derive value from the service, while a series of tips-and-tricks emails will use an understanding of which features are and are not being used to suggest areas where a customer can see even more value from the service.

By taking their understanding of digital body language beyond marketing and into the customer lifecycle, D&B is focused on ensuring that their customers’ renewal decision is based on a year of maximum success with their service.

Read More...

How to generate random numbers in t-sql?

How to generate random numbers in t-sql? Is there any built-in logic or keyword for Random? Yes, the answer is below.
DECLARE @mx int, @mn int;
SELECT @mx = MAX(EmployeeID) FROM Employee;
SELECT @mn = MIN(EmployeeID) FROM Employee;
SELECT EmployeeID,EmployeeName FROM Employee WHERE EmployeeID = ROUND(@mn + (RAND() * (@mx-@mn)),0);
RAND() is the function that gives you a random number between 0 and 1, so scaling it by the minimum and maximum gives a random value in that range. Note that if there are gaps in the EmployeeID sequence, the rounded value may not match an existing row, in which case the query returns nothing. Hope this is what you are looking for. Read More...

How to write inner join in update query T-SQL

When we are very new to t-sql, we often struggle with how to write a query that updates a table using an inner join. If you don't know this, your queries become very complex. Below is an example of how to write it.

Where can it be useful?
For example, suppose a DB admin created a table called State, but the Employee table stores the StateName as the foreign key instead of the StateID. Now he wants to update the State column in the Employee table with the StateID that corresponds to each StateName; for that we need an inner join in the update query.
--Syntax:
Update d set d.ColumnName = pt.ColumnName
from Table1 d
inner join Table2 pt on d.ColumnName = pt.ColumnName

--Example:
Update e set e.State = s.StateID
from Employee e
inner join State s on e.State = s.StateName
Read More...

How to change column size and column name through t-sql

Alter a column size in a table:

Syntax: Alter Table TableName Alter Column ColumnName DataType(Size)

Example: Alter Table dbo.[AddressType] Alter Column [Address] nvarchar(512);

Change ColumnName, DataType in a Table:
In t-sql, if we want to change both the column name and the data type, it's somewhat more complex than other alter statements. Below is the process for changing the data type as well as renaming the column.

Syntax:
--Change the column's data type.
Alter Table TableName Alter Column ColumnName DataType NULL/NOT NULL

--Rename the column using the sp_rename stored procedure.
EXEC sp_rename
@objname = 'TableName.OldColumnName',
@newname = 'NewColumnName',
@objtype = 'COLUMN'

Example:
Alter Table AddressType Alter Column Address VARCHAR(512) NULL

EXEC sp_rename
@objname = 'AddressType.Address',
@newname = 'Address1',
@objtype = 'COLUMN'

How good is it? Did you find any better way than this? Read More...

How to Create/Drop an index in a table through t-sql

Create an Index in a Table:

Syntax: CREATE INDEX [IndexName]
ON [dbOwner].[TableName] ([ColumnName])
WITH ( ALLOW_PAGE_LOCKS = OFF)
ON [PRIMARY]

Example: CREATE INDEX [UQ_DateCreated]
ON [dbo].[Contact] ([DateCreated])
WITH ( ALLOW_PAGE_LOCKS = OFF)
ON [PRIMARY]

Drop an Index in a Table:

Syntax: DROP INDEX [IndexName] on dbOwner.TableName

Example: DROP INDEX [IX_Content] on dbo.Content Read More...

How to add a column and set primary key in a table through t-sql

Add a Column to a Table:

Syntax: ALTER TABLE TableName add columnName DataType NULL/NOT NULL

Example: ALTER TABLE Employee add IsDeleted bit NULL

Add a Primary Key to a Table:

Syntax: ALTER TABLE TableName add PRIMARY KEY(ColumnName)

Example: ALTER TABLE AddressType add PRIMARY KEY(AddressTypeID)

Note: A primary key does not allow duplicate or NULL values. Read More...

How to set Foreign key through sql script

Syntax:
ALTER TABLE TableName
ADD CONSTRAINT ConstraintName
FOREIGN KEY (ColumnInFirstTable)
REFERENCES ReferenceTable(ColumnName)

Example:
ALTER TABLE Employee
ADD CONSTRAINT Fk_Employee_AddressType
FOREIGN KEY (AddressTypeID)
REFERENCES AddressType (AddressTypeID)

Note: The referenced column, AddressType.AddressTypeID, must be a primary key (or have a unique constraint) in the referenced table. Read More...

Use of SET ANSI_NULLS ON in SQL SERVER stored procedures

As we discussed the use of NOCOUNT in the previous post, there is another statement we use in almost all SQL Server stored procedures: SET ANSI_NULLS ON/OFF.

When is it useful?
It specifies SQL-92 compliant behavior of the Equals (=) and Not Equal To (<>) comparison operators when they are used with null values.

Syntax: SET ANSI_NULLS {ON | OFF}
The SQL-92 standard requires that an equals (=) or not equal to (<>) comparison against a null value evaluates to FALSE.

When SET ANSI_NULLS is ON, a SELECT statement using WHERE column_name = NULL returns zero rows even if there are null values in column_name. A SELECT statement using WHERE column_name <> NULL returns zero rows even if there are non-null values in column_name.

When SET ANSI_NULLS is OFF, the Equals (=) and Not Equal To (<>) comparison operators do not follow the SQL-92 standard. A SELECT statement using WHERE column_name = NULL returns the rows with null values in column_name. A SELECT statement using WHERE column_name <> NULL returns the rows with non-null values in the column.
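A quick example to illustrate the difference (the Employee table and its nullable ManagerID column are only for illustration):

-- Assume Employee has a nullable ManagerID column.
SET ANSI_NULLS ON;
SELECT * FROM Employee WHERE ManagerID = NULL;   -- returns zero rows, even if NULLs exist
SELECT * FROM Employee WHERE ManagerID IS NULL;  -- the reliable way to test for NULL

SET ANSI_NULLS OFF;
SELECT * FROM Employee WHERE ManagerID = NULL;   -- now returns the rows where ManagerID is NULL

Read More...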

Need of SET NOCOUNT in SQL SERVER SPROCS

In almost every stored procedure we include a common statement: SET NOCOUNT ON/OFF. But most of us have no idea why we use it.
It controls whether the message reporting the number of rows affected by a t-sql statement is returned to the client.

"Stops the message indicating the number of rows affected by a Transact-SQL statement from being returned as part of the results."

Syntax is: SET NOCOUNT {ON | OFF}

When SET NOCOUNT is ON, the count (indicating the number of rows affected by a Transact-SQL statement) is not returned. When SET NOCOUNT is OFF, the count is returned.
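For example, in a stored procedure (the procedure name, table, and values here are only for illustration):

CREATE PROCEDURE dbo.CleanUpEmployeeState
AS
BEGIN
    SET NOCOUNT ON;  -- suppresses the "(n row(s) affected)" messages for the statements below
    UPDATE Employee SET State = 'Unknown' WHERE State IS NULL;
    SET NOCOUNT OFF; -- row count messages are returned again from here on
END

Read More...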

Debug SQL Server stored procedures in Visual Studio

We have a beautiful feature in Visual Studio: we can debug stored procedures.
Follow the steps below to debug SPROCS.

  1. Go to Server Explorer (Visual Studio -- View menu -- Server Explorer),
  2. Right click on Data Connections -- select Add Connection,
  3. Type the SQL Server name, select the authentication type (Windows or SQL Server), and then choose your database. Click OK. Now you are connected to SQL Server.
  4. Select a SPROC and Right click on stored procedure -- select Step Into Stored Procedure.
  5. Give the default values to debug. Now it starts Debugging.
That's it!!! How simple it is...
Let me know your feedback on it. Read More...

MuteX could not be created error in ASP.NET

If you get an error like "Mutex could not be created", the problem is that too many versions of your site are sitting in the ASP.NET temporary files. To fix it, do the following.
  1. If you have visual studio 2005 open, then close it
  2. Go to the ASP.NET temporary folder for v2.0 of the framework: drive:\Windows\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files
  3. Remove the folder for your application. [If it is not deleting, then do IISRESET and delete it.]
  4. Reset IIS (on a command line window, >iisreset)
  5. Now, browse application(http://localhost/your app)
  6. Then reopen Visual studio, open your project in VS and build it.
  7. And it should work now.
That should resolve the problem, and your application should run without issues. Hope this tip helped you out!!! Read More...

Register IIS with the ASP.NET framework

Sometimes we forget to install IIS before installing the ASP.NET framework. In these cases we run into various problems because IIS is not registered with the ASP.NET framework. Please follow the steps below to register it.
  1. Open Command Prompt;
  2. Move to your (.net) version directory, like C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727>
  3. Run aspnet_regiis.exe with the -i option. (For Windows Server 2003 you must also pass the -enable option; for Windows 2000/XP the -i option alone is enough.)
Example: C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727>aspnet_regiis.exe -i -enable

Now just run IISRESET once to set everything correct. Read More...

Special behaviour of Aggregate or Set operation functions (SQL Server)

If you are calculating the average of employee salaries from the Employee table, note that aggregate functions such as AVG() ignore NULL values.

Note: 1. You may see the message "Warning: Null value is eliminated by an aggregate or other SET operation."
2. COUNT(*) and COUNT_BIG(*) (which return the number of items in a group; COUNT_BIG returns bigint) do not ignore NULLs or duplicates. COUNT(column) and the remaining aggregate functions ignore NULL values.
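A small example of the difference, assuming an Employee table with five rows, two of which have a NULL Salary:

SELECT AVG(Salary)   FROM Employee;  -- averages only the three non-NULL salaries
SELECT COUNT(*)      FROM Employee;  -- 5: COUNT(*) includes rows with NULL Salary
SELECT COUNT(Salary) FROM Employee;  -- 3: COUNT(column) ignores NULL values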

Take care with these aggregations while analyzing your values.... Read More...

Sql Server error message "don't allow remote connections"

If your SQL Server gives an error like "doesn't allow remote connections", or an error like the following:
A connection was successfully established with the server, but then an error occurred during the login process. (provider: Named Pipes Provider, error: 0 - No process is on the other end of the pipe.)

The main cause of this error is that you are trying to connect to a remote SQL Server, but the remote connections option is disabled on that server.

Do the following to solve the problem.
Start -- All Programs -- Microsoft SQL Server 2005 -- Configuration Tools -- SQL Server Surface Area Configuration -- Surface Area Configuration for Services and Connections -- choose your server (MSSQLSERVER or SQLEXPRESS) -- Database Engine -- Remote Connections -- select "Using both TCP/IP and named pipes", then click Apply and OK.

Note: You must restart the SQL Server service after changing this configuration. Go to Administrative Tools -- Services -- your server (MSSQLSERVER or SQLEXPRESS) -- restart/start. Read More...