Data Management Is as Sexy as a High Quality Mattress

I'm excited to have Tim Wilson from Gilligan on Data contribute today's guest post. Tim is one of the smartest guys on data management and data quality in the industry and brings a great perspective on what works in the real world. He also has one of the wittier writing styles out there, which makes his posts fun to read. I enjoyed this one, and I hope you do too.




=======================================


When Steve asked me to write a guest post about marketing automation and data quality, I couldn't resist, as we've been going back and forth on our respective blogs exploring the issue. It really started with Steve's Contact Washing Machine post late last year, which he followed up with in April of this year with a post about the need for that washing machine to be managed in-house, largely due to the diversity of sources of contact data. I added my own thoughts about the teeter-totter of customer data management a month later. That back and forth led to Steve thinking I might have a worthwhile direct contribution to his blog.

So, here it is:

Data management is like a mattress. It's not nearly as interesting as what gets done with it (on it)...but it's still awfully important!

The truth is, you can ignore the mattress and still get some interesting things done, but, eventually, as you wake up with a sore back, as you don't sleep well in the first place, and as you get shoved into awkward positions by pits and valleys...the interesting stuff just isn't going to be as interesting and effective.

Let's see how far we can push this analogy before it absolutely collapses under its own metaphorical weight.

Know What's Important about Your Mattress

Imagine the scenario: you're a spastic sleeper, flailing about on the calmest of nights; your significant other is a very light sleeper and wakes up at the slightest of touches. What's important? A mattress with enough room for you to roam about. That may be way more important to you than, say, the firmness of the mattress, which may be very important to someone with a chronically sore back.

It's easy to shoot for the stars with your contact data by trying to ensure that every contact attribute you capture is complete, accurate, and current. The problem is that shooting for a star is overly ambitious -- NASA is only now getting close to pulling that off for the first time. The same goes for your contact data. If you expect to have all of your data 100% clean, you will wind up with all of your data equally dirty, and it will hurt you. Prioritize your contact attributes so that you know what data is most important. The most important data will always be your core communication details: email address, mailing address (if you use direct mail as a communications channel), phone number, etc. After that, it really depends on your long-term marketing strategy -- focus on the data that matters most.

Start with a Good Mattress

Steve's contact washing machine is one example of this: at every point where you are capturing contact data, do what you can to capture it accurately. Be prepared to invest more -- in internal technology development as well as in third-party tools -- to ensure the highest accuracy of your most critical data. For instance, check that the e-mail address the prospect provides is well-formed. If the mailing address is a high priority, then, for U.S. addresses, consider validating the address provided against a CASS-certified tool. Build in other logical checks -- can the user put in that they have 5,000 employees at their company but annual revenue of less than $1 million?
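As a concrete illustration, here is a minimal sketch of the kind of checks described above. The 5,000-employee/$1M combination comes from the example; everything else (function names, field names, the deliberately loose email pattern) is an assumption, not a prescription:

```javascript
// A basic well-formedness check for email addresses. This intentionally
// stays loose -- overly strict patterns reject valid addresses.
function isWellFormedEmail(email) {
  return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email);
}

// A logical consistency check like the employees-vs-revenue example.
// It flags implausible combinations rather than blocking the submission.
function flagImplausible(contact) {
  const warnings = [];
  if (contact.employees >= 5000 && contact.annualRevenue < 1000000) {
    warnings.push("Employee count and revenue look inconsistent");
  }
  return warnings;
}
```

Note that the consistency check only flags the record for review; as the cautions that follow point out, hard-blocking a visitor on imperfect validation logic costs more than it saves.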

Be careful: it can be tempting to build in all sorts of logic to check that you are capturing good information, but that can be risky for two reasons:




  • Faulty logic in your checking -- we've all been to a web site at one time or another that tells us we've entered something incorrectly...when we haven't. I've been on the inside of a company that had this happening with one of their most highly-trafficked lead acquisition points. It's not pretty. It's better to get 95% perfect data quality and have 100% of the visitors to your site get to the information they want than to have 99% data quality and 10% of your visitors getting caught in an endless (flawed) validation loop that leads them to give up and leave (with a bad taste in their mouth about your company).


  • Losing sight of your priorities -- have you ever been to a web registration form with the "Red asterisks denote required fields" note...and then every field has a red asterisk? This is bad. Yes, you want your data as clean as possible, but you want the data that is most important to really be clean. Prioritization sucks, but you've got to do it.



Flip Your Mattress

"Will everyone in the room who has flipped their mattress in the past six months as per the manufacturer's instructions please stand up? Wow. There's one guy. Usually no one stands up when I ask that question. Oh. He's just taking a call on his cell phone."

Data management cannot stop at the point that you've got your data capture mechanisms set up. This is where the mattress analogy breaks down a bit, as ensuring that you are constantly working on the quality of your data is wayyyy more important than your mattress-flipping schedule.

Here's the contact data-equivalent mental exercise to the mattress-flipping survey above:




  • How many people are in your department at work? How many of those people joined the department in the last year? How many people were in the department a year ago and are not any longer? How many people have had a change in job title or responsibilities in the last year? Given your answers to these questions, roughly speaking: what percentage of your department has had key attributes of their contact profiles change in the last year? 10%? 20%? More?


  • Now look at your database. What percentage of your contacts have had no updates to their key profile data in the last year?



Do you see where this is heading?

The point: we tend to be wildly optimistic about the quality of our contact data, because we underestimate how rapidly that data decays. We assume that the rest of the business world is more static than our own immediate environment.

This is where marketing automation, and your overall marketing program, really start to show their symbiotic relationship with the management of your contact data. All too often, we live with some cognitive dissonance, in that, when we talk about the quality of our customer data, or when we manually inspect a handful of records, we quickly realize that much of the data is old or incomplete. We then turn around and build automated marketing programs that pretend the data is perfect. We reconcile this by telling ourselves that it's the best data we have, it's better than nothing, and there's nothing we can do about it. This is not true.

While there is no magical, easy way to maintain your customer data quality on an on-going basis, you do have opportunities in many of your marketing activities to fight off the beast of data decay:





  • When known users hit a registration form on your web site, prepopulate it with the data you have about them and include a simple note asking that they confirm the accuracy of the information before submitting the form


  • Alternatively, or in conjunction with the above, add a persistent element throughout your web site that shows the 3-5 most critical fields about the visitor with a clear "Update my information" link


  • In direct mail and direct e-mail campaigns, include the explicit information (including information you have determined based on implicit/behavioral data, when applicable) about the person, with a secondary call to action for them to update that information. (For four years in a prior role I regularly received direct mail from Microsoft targeted to me because I was an "IT executive" who, apparently, had responsibility for IT infrastructure -- if there had been a way for me to tell them I was woefully misflagged in their database, I would have done so.)


  • Factor in the "last updated" date for the contacts when developing your promotional lists. You may already be running some form of reengagement program on old leads -- don't assume that the job title or role is remotely accurate for these contacts. If this program includes a, "We haven't heard from you in a while" component, a non-aggressive tactic can be to ask them to update their information and interests so that you will not bother them with information in the future that is not useful to them.


  • Don't assume that the humans in your company are thinking of data quality when they have direct interactions. Do some digging into your telemarketing and inside sales processes to ensure that they include steps to check for the currency and accuracy of the key data points when they interact with leads directly.
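The first tactic above, prepopulating a form with what you already know, can be sketched as a small helper. The field names and the shape of the contact record here are my own assumptions, not from the post:

```javascript
// Hypothetical helper: merge what we already know about a contact into a
// form's default values, so the visitor only confirms or corrects them.
function prefillFormValues(formDefaults, knownContact) {
  const prefilled = { ...formDefaults };
  for (const field of Object.keys(formDefaults)) {
    // Only overwrite a default when we actually have a non-empty value.
    if (knownContact[field] != null && knownContact[field] !== "") {
      prefilled[field] = knownContact[field];
    }
  }
  return prefilled;
}
```

The point of merging into the form's own defaults is that fields you don't have data for stay blank, so the visitor's confirmation step doubles as a data-collection step.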


In short, flipping your contact data mattress is not something you can do with a few simple steps twice a year. It really needs to be an on-going process that is embedded in small ways throughout your marketing programs, always keeping in mind that the burden on the contact himself/herself needs to be kept to an absolute minimum.

Sleep Well!

At the end of the day, you want your contact data to be as accurate as possible so you can drive more sales. A better mindset, though, is to recognize that "more sales" is the end, and the means to that end is "provide more value to your leads by better understanding their wants and needs." In other words, contact data management is about being customer-centric first, which will lead to improvements in your lead qualification process, which will improve the handoff of leads to Sales, which will lead to higher revenue...and a good night's sleep!


Copy data from MS Office Project to MS Word

We usually work in MS Office Project when we deal with project management, project estimation, and so on. I used it to create a project estimation document to forward to management and a client. Unfortunately, the client didn't have MS Office Project installed, so I needed to send it in either Excel or Word format.

I planned to copy and paste the data from MS Office Project into MS Word, but when I did, it came out as unformatted text. It didn't look good at all, and I couldn't format it manually because it is a very big document.

After spending some time on it, I found a solution. Please follow the steps below.

    • Select the information pasted into Microsoft Word from Microsoft Project.
    • On the Table menu, click Convert, and then click Text to Table.
    • Under Separate Text At, click Tabs, and then click OK.

That's it!!! This trick formatted everything nicely and rendered it as a table. It saved me a lot of time.

Hope this helps you too!! You can get more details about it here.


The castle...sort of.

This past weekend we went to see the castle. Schloss Neidstein (Schloss means castle) is located only 30 minutes from us. It was recently owned by Nicolas Cage (he sold it in April '09). We were so excited to see this:



So imagine our surprise when instead we saw this:



Hmmmm...big difference there...so we asked around, "Wo ist das Schloss?" All hands pointed right to the big red building that Maddie described as "It looks like a barn." We just happened to be there on the same day as the town's Blumenfest - or garden walk. Town residents had opened up their gardens for people to walk through and admire. We walked through a few and enjoyed the beautiful grounds and then went to the castle. A pleasant surprise...they were doing tours!


Eric in front of a bee hive building...bees were buzzing all over the place.


Maddie in front of a beautiful rose bush





So inside the castle we went...still wondering why Google Images displayed such an impressive castle while the one we were in was anything but... We did see some incredibly beautiful things...








Eric was just a little disappointed...he thought this castle was boring. "I expected more armour and axes...there was only one of each!" John especially loved the door handles and locks. I loved the twists and turns and hallways that led to nowhere. Maddie loved the trinkets and knickknacks that she found in every nook and cranny, and Bella loved matching one of the fireplaces...it was as green as her sweater!

When we got home I looked up Schloss Neidstein again on Google...oops...you mean it's one more town down the road? So, we went to a castle...but it wasn't THE castle. I think we may try to get to THE castle again this week.

Off the first medication...

We had our first appointment with madmad's new doctor here...Dr. K is a child psychologist. He agrees with us...we should take her off the medications to see if environment will be the key to her behavior issues.

Hoorah!

He gave us a huge vote of confidence. After meeting alone with her he said, "Well, she's definitely in the right place." It was good to hear that.

She is currently on two different medications. The first is to help her concentrate and the second is to help with her aggression. We have now officially taken her off the first medication. It's been 24 hours, and according to Dr. K that is how long it takes to leave her system...so Maddie is now medication 1 free!

So far it's been interesting...I haven't seen anything in her behavior that she wouldn't normally do... she's actually a little somber and sullen today.

Medication 2 will be done in a couple of weeks. Dr. K wants to see how she does without medication 1 first.

Keep your fingers crossed...ultimately we want what is best for her and I'm hoping this is it!

How to submit all posts to Google Webmaster Tools for better indexing of all the posts in my blog

You may know that Blogger's atom.xml feed returns only the most recent 25 posts. This is the file we usually submit to Google Webmaster Tools to get our blog posts indexed. But because it returns only the top 25, the Google crawler indexes just those 25 posts and ignores anything older. You lose visitors for those older posts, because they won't show up in search results if they were never crawled properly. So how do we solve this issue and submit all posts to Google Webmaster Tools?

Solution:

Before you proceed, please check the posts below for a better understanding.

  • How to submit your Blogger blog site feed to Google Webmaster Tools if you have enabled the Post Feed Redirect URL option. View here.
  • How to see all posts in the atom.xml feed instead of getting only the top 25. View here.

Once you have read the above posts, you will have a complete idea of how to solve the problem.

  • Open the Google Webmaster Tools site and submit the URLs below.
  1. To submit the top 50 posts, use

   atom.xml?redirect=false&start-index=1&max-results=50

  2. To submit posts 51 through 100, use

   atom.xml?redirect=false&start-index=51&max-results=50

  (Note: max-results is a count, not an end index, so each page of 50 posts starts where the previous one left off.)

This way you can submit all of your posts to Google Webmaster Tools so everything gets crawled, instead of only the top 25. How good is that?


Blogger's feed is showing only 25 posts; how to get all of them?

This question had been in my mind for a long time: why doesn't the feed show all the posts when I open atom.xml? It always gives me only the top 25 posts. So how do we get all the posts?

Solution:

  • We need to use certain querystring parameters to get all the posts.
  1. start-index : tells the Blogger feed generator which post to start from.
  2. max-results : tells the Blogger feed generator the maximum number of results to return, counting from the start index.
  3. redirect : not actually needed unless you have enabled post feed redirection, but it is better to always keep it in the feed URL. You can find more information about it here.

How to use:

After the atom.xml feed URL, append the querystring below.

?redirect=false&start-index=1&max-results=50

For testing you can browse my feed http://praveenbattula.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=50

This will return the top 50 posts in the feed. By default we get only the top 25, but with this solution the feed will return as many posts as you want.

To get the top 100 results, use this querystring:

?redirect=false&start-index=1&max-results=100

To get results 51 through 100, use this (remember that max-results is a count, not an end index):

?redirect=false&start-index=51&max-results=50

You can give any values to the start-index and max-results parameters to get as many results as you want.
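Putting the parameters together, here is a small sketch that builds the full set of paged feed URLs for a blog, 50 posts per page. The blog URL and post count are examples, not real values from this post:

```javascript
// Generate the paged atom.xml URLs needed to cover an entire blog,
// stepping start-index forward by one page size each time.
function feedPageUrls(blogUrl, totalPosts, pageSize = 50) {
  const urls = [];
  for (let start = 1; start <= totalPosts; start += pageSize) {
    urls.push(
      `${blogUrl}/atom.xml?redirect=false&start-index=${start}&max-results=${pageSize}`
    );
  }
  return urls;
}
```

For a blog with 120 posts this produces three URLs (start-index 1, 51, and 101), each of which can be submitted separately.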

Happy Blogging.


How to submit a Blogspot feed to Google Webmaster Tools when a FeedBurner feed is enabled on the blog

I enabled the Post Feed Redirect URL option in my blog settings. From the time I did that, Google Webmaster Tools started giving the error "URL not allowed". But I need the post feed to redirect to FeedBurner instead of the general atom.xml feed. So how do I solve this problem? After reading the Google blog documentation, I found a parameter that solves this issue.

For my blog, the atom feed URL is http://praveenbattula.blogspot.com/atom.xml. When I open it, it redirects to my FeedBurner URL http://feeds2.feedburner.com/praveenbattula. So when I submit atom.xml to Google Webmaster Tools, it isn't accepted and I get the error "URL not allowed" because of the redirection. So finally I understood that the redirection was the problem.

If you use a custom redirected feed like FeedBurner, you need to add the "redirect" querystring parameter to your feed URL, which stops the redirection.

atom.xml?redirect=false

So when I use the URL http://praveenbattula.blogspot.com/atom.xml?redirect=false, I always see the message "successfully crawled" in Google Webmaster Tools.

So I use the redirect querystring only for Google Webmaster Tools. All users still see my customized FeedBurner feed; only the webmaster crawler sees my atom.xml feed. How nice is that?

Nice tip!!!!


Database Publishing Wizard – Generate Script for Schema and Data

When working with databases, we may get a requirement to generate a script for the schema, and sometimes a script for the data too. Generating a script for the schema is what we usually do in projects, but generating a script for the data is a less common requirement. Sometimes we need to do that as well, so this article explains generating scripts for both schema and data.

There is a cool tool from the Microsoft SQL Server team that will do this for us, called the Database Publishing Wizard. It has commands that support generating scripts for the schema as well as the data.

You can download it here.

http://www.microsoft.com/downloads/details.aspx?FamilyId=56E5B1C5-BF17-42E0-A410-371A838E570A&displaylang=en

Examples: 

Command to generate a script for both schema and data:
C:\Program Files\Microsoft SQL Server\90\Tools\Publishing\sqlpubwiz script -d AdventureWorks "C:\AdventureWorks.sql"

Command to generate a script for the schema only:
C:\Program Files\Microsoft SQL Server\90\Tools\Publishing\sqlpubwiz script -d AdventureWorks "C:\AdventureWorks.sql" -schemaonly

Command to generate a script for the data only:
C:\Program Files\Microsoft SQL Server\90\Tools\Publishing\sqlpubwiz script -d AdventureWorks "C:\AdventureWorks.sql" -dataonly

Note: Don't try it on databases larger than about 100 MB; it can tie up your computer for a while.


Microsoft SharePoint server 2007 Service Pack 2 update

We know about the expiration issue with SharePoint Server after installing Service Pack 2. Microsoft has been working on a fix for the past two months, and they have finally released an update that is ready for download. You can install it on a server with either SP1 or SP2 installed. You can see more details here.

So, what are you waiting for? Just download it and install on the server.


Toe sucker


I always wanted a pet...

The 24 giant snails that are now "pets" and reside in the backyard fort.


Me: Don't they get away at night while you guys are asleep?
Eric: Oh, don't worry. They go really slow.

I just wanted to watch her play...

It looks like so much fun.








(mom...have I told you lately that I'm still in shock and awe that you survived SEVEN of us?)

Nasty Gram is missing...

After Maddie's summer haircut, Maddie asked me to forward photos to Cruella and Cruella's mother...so I did. I figured we were in a new era of "friendliness" now that the custody arrangement is final so I could do that small favor.

Oh...did I forget to mention that? THE CUSTODY ARRANGEMENT FOR MADELINE IS FINAL! Woooooooooooohoooooooo!

Well, apparently a nasty gram was sent back to me because Cruella had previously told us not to cut her hair. Sorry. You don't get to make that decision anymore. muwahahahaha...

Just let me bask in the irony...

Anywho...I get an email back "apologizing" for the last email. It said something along the lines of - I reread my last email and realized I may have come across as sounding angry. Now that the shock of her hair being gone has worn off I wanted to make sure we were clear that I didn't mean all that. Oh yeah...and can you please send me a bunch of photos of the kids?

Hmmmmmmm.....I didn't get that nasty gram email...but I have to laugh because I know now that one was sent and I know now the gist of what was said. And of course she had to apologize because she wanted a favor from me.

So, just to be a little catty...mostly because I can and I'm in the mood for that...the next post will be all those photos she wishes she had. Hmpfh...

Is Time-of-Day Sending Overhyped?

I can't recall the last time I waited in my inbox at 10:01 on a Tuesday, clicking the "Send/Receive" button repeatedly, eagerly anticipating the next edition of my favourite corporate newsletter. As a recipient, I'm not particularly bothered by what times emails arrive. I strongly doubt that I'm alone in that.

So why is the email industry enamoured by time-of-day sending optimization? I suspect it comes down to a combination of three factors:

1) Marketers are dying to find ways to improve their effectiveness at connecting with their audiences

2) As email service providers, we can easily build time-of-day sending control into our systems, and it seems like a compelling and simple answer to marketers' needs

3) Tactical metrics like open rates may even show a swing across emails sent at different times of the day, "proving" the effectiveness of this technique


However, this avoids the real problem. The only true, long-term way to better engage with an audience is to repeatedly deliver content that is interesting and relevant at the particular stage of the buying process each recipient is in. This is not easy. As marketers, we need to work to understand the stage of the buying process each buyer is at and deliver content that is relevant to them. Unfortunately, it is much more difficult to understand buyer interests and needs than it is to simply time an email campaign.

So why does some of the data appear to show a real difference in effectiveness of email campaigns depending on when they were sent?

Much of that comes down to how email is handled in various situations and how that affects the measurement of data. Remember that measurements such as open rates are far from 100% accurate, as they rely on the rendering of an image in the email to indicate that it has been opened.

If, for example, an email is sent to me before or during my commute to work, I'm likely to open it on my BlackBerry. Images are not rendered, so it does not show up as an "open". The effectiveness of the email has not changed, just our way of measuring it.

Time-of-day sending can be very relevant in certain situations: with media types outside of email, such as voice or SMS, or if an email is being sent on behalf of a salesperson and would seem strange arriving at 2am. However, avoiding the challenge of delivering relevant, compelling content in order to focus on time-of-day sending is spending effort in the wrong area.

I look forward to your comments. Are there situations where you have found time-of-day sending highly relevant?

SQL Server 2008 Developer Training Kit

The SQL Server 2008 Developer Training Kit will help you understand how to build web applications that deeply exploit the rich data types, programming models, and new development paradigms in SQL Server 2008. It has plenty of demos, such as:

  • Spatial Types Demo
  • Intro to Filestream Demo
  • SQL CLR Nullable Types Demo
  • Programming with Filestream Demo
  • Reporting Services Web Application Integration Demo
  • Date and Time Support in SQL Server 2008 Demo

and many more. It is really a very cool kit and gives you plenty of features and options to build your own stuff on. You can download it here.


Windows 7 RC training kit for Visual Studio 2008

Hi,

Again I've come up with a Windows 7 blog post. I am waiting for the Windows 7 release, and this post is to let you know that we can customize a Windows 7 system to our liking. We get the toolkit, develop modules with the help of Visual Studio 2008, and deploy them to Windows 7. It is easy to customize. For this you need to install the Windows 7 RC training kit. You can download it here.

What does it contain? It has demos and examples showing how to work with the different modules, and plenty of presentations that explain to developers how to proceed. This is one of the easiest ways to understand things in little time. You can customize a lot of areas of Windows 7; the main modules are below.

  • Taskbar
  • Libraries
  • Multi Touch
  • Sensors and Location
  • Ribbon
  • Trigger-Start Services
  • Instrumentation and ETW
  • Application Compatibility

You need to install the Windows 7 SDK before you start working with it. So what are you waiting for? Come on, download it and rock on!!!


Oh to be seven...

I cooked pork and beans for dinner.

Maddie asked how I made such a delicious dish. I replied, "I opened a can."

Her response, "Can you teach me how to do that?"

Locks of Love...

Maddie decided to donate her hair to Locks of Love... She was going to do it during a school assembly last year, but chickened out. We talked it over here and decided that it would be a good idea for a summer haircut that was easy to take care of. Maddie was nervous, but now she loves it!

Before...


After! Twelve inches of ponytail being sent to Locks of Love.


Aaaaaah! Put it back on!


Tell me how cute this is!


The back of my new "do"...it's layered because of where we had to cut the ponytail.

The Goals of Lead Nurturing

One of the most common ways to use a marketing automation system is for lead nurturing. Also called “drip marketing”, “nurture marketing” or various other names, this is the art and science of keeping prospects “warm” until such time as they are ready to buy. At that level, there seems to be general agreement that it’s a great process to put in place. Similarly, the results are clearly showing that there is tremendous value in nurturing leads. However, there is often something of a lack of consensus on what the approach should be for nurturing leads.

At a high level, I would describe the goals of lead nurturing as three things, in order:

1) Maintain permission to stay in contact with the prospect: This is by far the most important goal of lead nurturing, and one that is most often overlooked. If a prospect emotionally unsubscribes, you have lost your connection with them, and you may in fact be marked as spam.


2) Establish key ideas, thoughts, or comparison points through education: A prospect you are nurturing may not enter a buying process for many months, if not quarters. However, if you can educate prospects, and by doing so, guide their thinking slightly to incorporate key requirements and ways of analyzing the market, when they do become buyers, you will be much better positioned.


3) Watch for signs of progress through the buying cycle: As you nurture prospects, you can watch their digital body language to give you an understanding of when they are moving to a new stage of their buying process.

Maintaining permission to stay in contact with the prospect is, as mentioned, the most critical aspect. As we’ve seen, unsubscribe rates, as measured by explicit clicks on your unsubscribe link can be very deceptive. Many recipients will emotionally unsubscribe instead. In order to manage your lead nurturing processes successfully, you need to ensure that you pay close attention to the engagement level of your audience through watching their response activity and manage the frequency with which you communicate with them in the lead nurturing process accordingly.

The need to maintain your audience’s permission to stay in contact with them is the key driver of why high quality, valuable, non-salesy content is crucial to your nurture strategy. However, that content can also guide thought processes and decision criteria. Often, buying decisions are influenced by how buyers think about the market, what “fault lines” they see as crucial in comparing vendors, and what they believe to be possible in the market. Depending on the buying process challenge you face, you can use this content marketing opportunity to educate buyers on what is possible, or on key buying criteria they may not have considered.

If you are successful in the first two goals, you can then begin to look for progress along the buying process by watching your buyers' digital body language. You can include "teaser" content, such as content from an RSS feed, and watch for signs of interest in a topic that indicate direct buying interest, or you can establish interim actions for prospects to take, such as signing up for a webinar or trial, that would indicate a deeper buying interest.

Lead nurturing can be a very powerful way to stay engaged with future potential prospects, and in doing so, successfully establish buyer preference and understand buyer timing. However, it only allows you to accomplish this if you are careful to maintain your audience's permission to remain in contact with them.

What Exactly IS Digital Body Language?

I've been using the term "Digital Body Language" on this blog quite a lot for obvious reasons. However, I have not really taken a moment to define the term, as I realized recently after a presentation on the topic. An audience member came up to me and very hesitantly asked, "I think I get the overall concept, but what exactly IS digital body language?" It was a very fair reminder to me not to get so caught up in a topic that the basic concepts are overlooked.

So what is it?

What we are referring to when we talk about Digital Body Language is the aggregate of all the digital activity you see from an individual. Each email that is opened or clicked, each web visit, each form, each search on Google, each referral from a social media property, and each webinar attended are part of the prospect's digital body language.

In the same way that body language, as read by a sales person managing a deal, is an amalgamation of facial expressions, body posture, eye motions, and many other small details, digital body language is the amalgamation of all digital touchpoints.

The best physical representation of digital body language that I can think of is Eloqua's Prospect Profiler tool (pictured), which shows the spikes and valleys of activity across all digital media, from web and email to search and forms. You can see how a perspective of what is happening with that individual can be gained from seeing this amalgamated digital insight.

However, the raw information that digital body language provides is often only the foundation. Much as each facial muscle contributes to our reading of a person's body language, the raw digital information is mainly of use when looked at through the lens of lead scoring to understand whether an individual is ready for sales, or what buyer role they play in the process.

Hopefully this clarifies what Digital Body Language is. As always, I welcome your questions or comments. Read More...

Knowing browser width and height [for all browsers]

Today at work I needed to implement some JavaScript for a SharePoint web page. I first developed the JavaScript in a simple HTML page and, once it was running successfully, moved the code to the SharePoint page.

In the HTML page it worked fine, but on the SharePoint page it did not. After some research I traced the problem to the lines using document.documentElement.clientWidth and document.documentElement.clientHeight, though I wasn't sure why they failed; perhaps the SharePoint page was being rendered in a different document mode. I was browsing both the HTML page and the SharePoint page in the same browser, so the behavior was weird.

Solution:

I read through the properties available on the document and window objects, and below is the code I came up with.

function GetWindowProps() {
    var browserWidth = 0, browserHeight = 0;
    // Non-IE browsers (Mozilla, Safari, Opera, Chrome) expose window.innerWidth/innerHeight.
    if (typeof window.innerWidth === 'number') {
        browserWidth = window.innerWidth;
        browserHeight = window.innerHeight;
    }
    // All IE versions except IE 4: use document.documentElement.
    else if (document.documentElement && (document.documentElement.clientWidth || document.documentElement.clientHeight)) {
        browserWidth = document.documentElement.clientWidth;
        browserHeight = document.documentElement.clientHeight;
    }
    // IE 4 (and quirks mode): fall back to document.body.
    else if (document.body && (document.body.clientWidth || document.body.clientHeight)) {
        browserWidth = document.body.clientWidth;
        browserHeight = document.body.clientHeight;
    }
    // Return both values so callers can actually use them.
    return { width: browserWidth, height: browserHeight };
}

This returns the correct values and works in any browser. What do you think?

Read More...

An apple a day...

We just had Bella's 6 month well-baby appt...yes, I know she is 7 months old...I'm just a slowpoke procrastinator because I hate those darn immunizations...

We have a new Dr...well, duh, of course we do...we're in Germany now... Dr. Zoesh (pronounced Zesh) prefers to just be Dr. Z. He's a regular comedian and a great pediatrician. Bella took right to him...screamed and cried the entire appt. He said it didn't hurt his feelings...he had so much rejection in junior high and high school he is immune to it now.

Our little girl is soooo tiny! Dr. Z is concerned because her weight continues to move further down the % charts...she's now barely over 1% for weight (13.2 pounds today!), but he also said it's not too serious because she has great skeletal growth (25% for height (26 inches!) and 75% for head).

He told me to find the healthiest kind I could and start feeding her yogurt to fatten her up a little. I thought dairy was off limits until after age 1, but after a little research learned that yogurt and cheese are ok? I find this a great excuse to be able to go searching for delicious yogurts...can't wait for this shopping trip!

She also had her shots today...poor little thing..

Grow baby girl...grow!

Maddie and Eric both had Dr. visits today too. We just moved them under our insurance so this was their initial evaluation. Their doctor said they are too skinny...then she heard what medication they'd been on, rolled her eyes and said, "well, no wonder." HA! I think I'm really going to like her.

We've got a referral to have Madeline evaluated by a psychiatrist here and so far the Dr is on board with trying to get her off the meds she is on now. We found out today that one of them is what they currently prescribe for PTSD psychotic episodes and bipolar disorders. Aiya...for a 7 year old! I know there is a better way...fingers crossed, prayers said, good vibes...send them all Maddie girl's way! Read More...

Sybase: Buyer Profiling for Micro Segment

We all know that it is often best to engage with our markets by segment. However, in some industries, this can be extremely challenging as the segments can be so small that they are extremely difficult to target. Sybase faced this challenge, as one of their products was of most interest to an extremely small audience - the "data elite".

To tackle the challenge, the Sybase team catered to the data elite's natural competitiveness - Peter Kim would call it an "ego trap", but regardless of what you call it, it's great marketing.

Enjoy the case study, it's from Digital Body Language:


Sybase: Buyer Profiling for Micro Segment

For one of its key data-management products, Sybase IQ, Sybase needed to engage with a specific set of its customers: “the data elite” – people who needed fast response times in a solution to tackle extremely high volumes of data. The Sybase IQ product leveraged a new approach to data storage and querying that resulted in performance improvements of many orders of magnitude. The target buyers, however, in many cases were not aware that such a solution was possible, and may have been grudgingly purchasing ever larger hardware in order to tackle the problem.

To connect with this audience, the Sybase team leveraged the naturally competitive nature of administrators of huge volumes of data, and their desire to compare themselves against their peers. The campaign targeted a scrubbed list of existing Sybase contacts and asked them for information on the extreme challenges they were tackling - data volume, response time, or both. Based on their answers, one of three cartoon icons guided them through an information-gathering process where they were ranked as a Pro, an Expert, or Elite by comparing them to their peers.

With this basic knowledge, the campaign guided them through five stages - from basic information collection to full engagement - by sharing thought leadership from industry gurus, and case studies of similar professionals who became corporate heroes by delivering massive performance increases. At each step, the content and detailed information provided was tightly matched to the individual's biggest challenge and rank. By observing their interactions with available content, the campaign transitioned the customer from one buying stage to the next.

By cultivating that competitive spirit among database experts over who tackles the larger data challenge, Sybase engaged with the "data elite" in ways that enabled the company to better understand who would be an ideal audience for the product. By catering to this competitive spirit, Sybase was also able to present solutions they had never thought possible to a very real challenge they were facing.

Read More...

SQL Server Management Studio for SQL Express

This is a question asked by most of the people around me, on the forums, and elsewhere. Once they have finished installing Visual Studio, or SQL Express directly, on their machines, they fail to open the database in SQL Server Management Studio, because when they go to All Programs > SQL Server, they don't find SQL Server Management Studio there. So, how do you see all the databases, or create one?

By default, SQL Server Management Studio does not come with the SQL Express installation. So, we need to install SQL Server Management Studio Express manually in order to view or create a database. You can download Management Studio Express here.

Isn't that a nice tip? Feel free to post a comment.

I will post more like this. Keep an eye on my blog!

Read More...

Eric's summer haircut

Read More...

Baby girl is 7 months old today!

It was a beautiful day for Bella's 7 month birthday. Maddie picked a flower for her hair and we sat out in our backyard for the photo shoot...



And today was the magic day...she SITS! Woohoo...well, we still have a tumbling over problem, but she managed to stay up for at least 20-30 seconds by herself.
Read More...

Ain't it the truth?

Read More...

Is Data Quality the "New Black"?


Anytime I talk about data quality with a marketer, I get the answer "yes, that's really important, but I don't know where to start, as we have so many problems and we don't have the resources." Well, I believe it is now more important than ever to implement a data quality plan, as the success of your campaigns depends on it. In fact, it is so important that I believe data quality will be the "new black" for this season of marketing campaigns.
We have found that customers who focus on data quality generate 267% more leads than those who don't.

Why would that be? Quality data drives your segmentation and targeting, personalization and more accurate lead scores. All of these things help deliver higher quality leads to your sales team.

Let me walk you through the top 3 things you should do to maximize data quality:

  1. Identify the sources of all of your new data and prioritize the quality level of data from each of those sources:
    a. Your CRM system may be top priority
    b. But a list from a new sales rep may be lower priority

  2. Standardize the fields and values you are getting from those sources – whether it is fields on a form, or the information you are capturing at a trade show
  3. Finally, put a system in place that cleanses new data to a minimum standard, "inline", as new contacts are added to your system – this is the critical part of the solution. Steve wrote a great article in April on the inline data cleansing concept, or contact washing machine.

With these three steps you will ensure you are in vogue with this season's marketing campaigns.
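The three steps above amount to a small inline pipeline: tag each new contact with its source priority, standardize incoming field names, and cleanse values to a minimum standard before the record enters the marketing database. Here is a minimal sketch in JavaScript to make the idea concrete; the source names, priorities, and field mappings are hypothetical, not part of any particular marketing automation product:

```javascript
// Hypothetical source priorities: higher means more trusted data (step 1).
var SOURCE_PRIORITY = { crm: 3, webForm: 2, tradeShow: 2, purchasedList: 1 };

// Hypothetical mapping of inconsistent incoming field names to standard ones (step 2).
var FIELD_MAP = {
    company_name: "company", Company: "company",
    email_address: "email", Email: "email"
};

function cleanseContact(raw, source) {
    var contact = { source: source, priority: SOURCE_PRIORITY[source] || 0 };
    for (var key in raw) {
        var std = FIELD_MAP[key] || key;      // standardize the field name
        var value = String(raw[key]).trim();  // minimum cleansing standard (step 3)
        if (value) { contact[std] = value; }  // drop empty values
    }
    return contact;
}
```

Because the cleansing runs inline, every contact is normalized the same way regardless of which source it came from, and the priority tag tells you which record to trust when two sources disagree.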

Read More...

How to see actual exceptions or errors in SharePoint instead of the Unknown error message

When I was new to SharePoint programming, I had plenty of trouble finding out what errors or exceptions were occurring on a SharePoint web page, because as far as I knew in those days, there was only one place to find the details of an exception or error: the logs [usually located in the 12 hive]. After gaining some experience with SharePoint, I slowly learned new things and found a way to see the exception directly on the screen [in the browser] instead of going to the logs.

When an exception occurs in a SharePoint environment, the page we usually see says only "Unknown error", nothing else.

To see the actual exception details follow the steps below.

  • Go to the location where your SharePoint application lives in the file system. It is usually at inetpub/wss/virtual directories/[port number].
  • Find the web.config file.
  • Take a backup of it.
  • Open the file in a good editor such as Visual Studio and find the tag <SafeMode
  • This is the first child tag under <SharePoint>.
  • The tag syntax looks something like this:

     <SafeMode MaxControls="200" CallStack="false" DirectFileDependencies="10" …….

  • One of the attributes of <SafeMode is CallStack, a boolean value that tells the SharePoint framework whether to show the actual exception details.
  • Set this value to true in order to view the actual exception details on screen [in the browser].
  • Set the customErrors section to Off to view complete details, as shown below:
  • <customErrors mode="Off" />
  • An IISRESET is not compulsory, but doing one is a good idea.
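For reference, here is roughly what the two relevant web.config fragments look like after the changes. This is only a sketch: keep all the other attributes and elements from your actual file exactly as they are, and only flip these two values.

```xml
<!-- Inside the <SharePoint> section: turn the call stack on (other attributes unchanged) -->
<SafeMode MaxControls="200" CallStack="true" DirectFileDependencies="10" ……

<!-- Inside <system.web>: turn the generic error page off -->
<customErrors mode="Off" />
```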

Hope this helps you develop faster, without the pain of digging through all the log files for errors.

Note: Please use this option only in the development environment, to speed up development. Don't make the above changes on production servers or in the QA environment. If you do, all your end users will see the actual exception or error details, which is not good.

Read More...

Data Quality: Balancing the Customer Experience

I was in a conversation recently with Tim Wilson from Gilligan on Data about the balance between the client experience and data quality when it comes to semi-standard data like title or industry. On one side of the spectrum, the best user experience is often free-form text. Forcing a user to select from a defined set of choices often leads to a frustrating experience. A short list of titles, for example, will often be missing a good match for the visitor’s title, and lead to a poor selection. A longer list forces the user to select from many, many options, and impacts their ability to quickly use the form.

However, on the opposite side of the spectrum, demand generation relies on clean data. Rules for such activities as segmentation, lead scoring, and lead routing may be built on such data fields as title or industry. Personalized content rules might select a piece of content based on visitor data, and analytics may present results that build off of the underlying data. In all cases, having clean data is critical to the success of these initiatives.

So, how do we balance the requirement for the best possible visitor experience with the need for cleansed data to work with within our marketing database? The answer is through using secondary data fields for standardized data. The user is allowed to input free-form data on the web form, which provides them with an optimal user experience.

As the form is submitted, this data is fed into an inline data cleansing system (such as a contact washing machine) to scrub the data. The free-form data is compared against a standard list of titles in the contact washing machine. Because this step is automated, and not part of the user’s experience, the size of the list of titles used does not matter, and accuracy does not have to be sacrificed.

However, when a match is made, the resulting data can be fed into a secondary field, rather than back into the original field, leaving the user's free-form data intact. In many cases, it may be useful to feed the data into more than one field. For example, when looking at a visitor's title, it may be useful to split it into a "level" component (Vice President, C-level, Manager, Director) and a "department" component (sales, marketing, finance, human resources).

As an example:
  • User Inputs: "V.P. Marketing"
which is then split into three data fields:
  • Raw Title is Maintained as "V.P. Marketing"
  • Level is Standardized as "Vice President"
  • Area is Standardized as "Marketing"

The personalization, scoring, segmentation, and routing rules that are needed can be built on the cleansed and standardized data, giving maximum accuracy and ease of use to the marketer. At the same time, the visitor is able to submit free-form data, which provides them with an excellent user experience.
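The standardization step described above can be sketched in a few lines of JavaScript. To be clear, this is a minimal illustration of the idea, not Eloqua's actual contact washing machine; the lookup tables, their patterns, and the normalizeTitle function are all hypothetical, and a real system would use far larger standardized lists:

```javascript
// Hypothetical lookup tables mapping common title fragments to standard values.
var LEVELS = [
    { pattern: /\bv\.?\s?p\.?\b|vice\s+president/i, value: "Vice President" },
    { pattern: /\bc[efom]o\b|chief/i, value: "C-level" },
    { pattern: /\bdirector\b/i, value: "Director" },
    { pattern: /\bmanager\b|\bmgr\b/i, value: "Manager" }
];
var AREAS = [
    { pattern: /marketing/i, value: "Marketing" },
    { pattern: /sales/i, value: "Sales" },
    { pattern: /finance/i, value: "Finance" },
    { pattern: /human\s+resources|\bhr\b/i, value: "Human Resources" }
];

// Return the first standardized value whose pattern matches, or null if none do.
function lookup(table, text) {
    for (var i = 0; i < table.length; i++) {
        if (table[i].pattern.test(text)) { return table[i].value; }
    }
    return null; // no match: leave the standardized field empty
}

// Keep the raw input intact and populate two secondary, standardized fields.
function normalizeTitle(rawTitle) {
    return {
        rawTitle: rawTitle,
        level: lookup(LEVELS, rawTitle),
        area: lookup(AREAS, rawTitle)
    };
}
```

Because the matching happens after the form is submitted, the visitor never sees these lists, and unmatched titles simply leave the secondary fields empty rather than forcing a bad selection.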

Read More...

ASP.NET FileUpload and File.Open() method problems

In this post I explain the problems with the File.Open() method when using a FileUpload control on an ASPX form.

Today at work I ran into plenty of problems with the ASP.NET FileUpload control in SharePoint. I was creating a custom form for a client requirement, and every attempt to upload a file to the document library failed.
The exception I was getting was: Could not find a part of the path "file upload path."

After 20-30 minutes of frustrated experimentation, I finally thought of checking the code again for a clue. Below are the details from my analysis.
I had used the following code to get the uploaded file as a stream and save it to a SharePoint document library.

using (FileStream fs = File.Open(fileName, FileMode.Open))
{
    // SharePoint programming - adding the document to a document library.
    SPFile destfile = folder.Files.Add(fileName.Substring(fileName.LastIndexOf("\\") + 1), fs, true);
    // your logic here
}

fileName is a variable passed to the File.Open() method that holds the file system path of the file from the upload control.
The code above was trying to find the file at that path on the server. Read that carefully: it was looking for the file on the server, at the path supplied by the client.

Take this scenario. I was working on machine1, a development environment (my MOSS server) on which my application was also running. I was testing the application from another machine, machine2 (the client). When I tried to upload a document from machine2 from a path such as c:\documents\abc.docx, the server-side C# call File.Open() looked for that same path, c:\documents\abc.docx, on the server, i.e. on machine1. There was the problem: the file doesn't exist at that path on machine1. This is the problem with using File.Open() with a file upload, so never use it in your C# code when dealing with the FileUpload control.

Here is the solution:

using (Stream fs = fileUpload.PostedFile.InputStream)
{
    // SharePoint programming - adding the document to a document library.
    SPFile destfile = folder.Files.Add(fileName.Substring(fileName.LastIndexOf("\\") + 1), fs, true);
    // your logic here
}

If you look closely, here I take the stream from the FileUpload control via fileUpload.PostedFile.InputStream instead of passing a file name to File.Open().

Be careful when using streams, and know the difference between a File.Open() FileStream and the FileUpload stream.
Hope this helps you understand both objects, and how and where to use each.

Read More...

Unpacked

U - Underwear is in which box?
N - Need curtains that aren't just sheers...the neighbors are peeping...
P - Please tell me that isn't really broken...
A - AAAAAAAAAAAH!
C - Can't find my other shoe... :(
K - Krispy Kreme ...where are you when I need you?
E - Everything is a mess...
D - Dead on my feet...have one more day of this and I should be done. Read More...

Checkbox and Checkbox list "value" attribute

In some projects, I need to create both ASP.NET controls and HTML controls on a web page and read their values with the Request.Form[] method, because that is the only way to get the values of HTML controls that are not declared with runat=server. [Take a look at my other post, which explains how to read the values of HTML controls in C#.] When I declare an ASP.NET checkbox on a page, Request.Form["checkboxname"] always returns "on" or "off", so I never get its value: the ASP.NET CheckBox has no Value attribute. The <asp:CheckboxList> control has the same problem. When I bind its data source, each checkbox renders as two controls on the page: an input of type checkbox, and a label that holds the value as display text. So when I try to get the value, I never get the checkbox value at all. That is, if I declare a checkbox list showing ASP.NET, SharePoint, and Java as checkboxes and want to know the user's selection via Request.Form[], I always get "on" or "off" rather than the actual strings. So, how do we get the value from the checkbox?

In these scenarios, I never use <asp:checkboxlist> on my web page. I always render multiple individual checkboxes as a list, precisely so I can get the values through Request.Form[""].

Declare a checkbox on the page and then use the line of C# below to set the value attribute manually.

cb.InputAttributes.Add("value", "checkboxvalue");

[If you need to build the checkboxes dynamically, you can, in a for loop, write out strings containing the HTML checkbox declaration with input type="checkbox" and access them in C# code.] This renders the value attribute for the control; InputAttributes is the collection we add to so that the attribute is emitted on the input element itself. The final output rendered on the browser side looks like this:

<input type="checkbox" id="cb" name="cb" value="checkboxvalue" />

So now, when you read the checkbox value using Request.Form["cb"], you will get the original value instead of "on" or "off".

Also see my related post, which explains how to access the checkbox list values on the client side.

Hope this helps anyone with the same requirement. Please give me your valuable feedback/comments. Do you know an easier approach that solves this problem?

Read More...

Rain rain go away...

Anna's tired of staying inside all day...

Her and Bella want to play...

Outside in the grass some day... Read More...

Microsoft Visual Studio 2010 – The great IDE in the IT world

Visual Studio 2010. Wow! Lots of features. What a product from Microsoft. It has almost everything integrated for developing ASP.NET-related technologies such as Silverlight, ASP.NET, SharePoint, and WPF, plus QA tooling. It supports the .NET Framework 4.0, which is again a big change.

Let's take a look at the features available in Visual Studio 2010.

First, I am a SharePoint guy, so I will start with the SharePoint tools available in Visual Studio 2010.

  • Until now, we could add new items for ASPX, ASCX, CSS, JS, CS files, and so on. Now there is also a web part project item, and a visual web part designer that loads a user control as a web part for SharePoint. Wow, how cool is that!
  • There is another fantastic facility for adding an event receiver for a SharePoint list, library, or site, with a wizard to choose the event receiver type; it then creates a source file for that event receiver. [Again, no need to create a special project and write everything manually.]
  • A special explorer window pulls information from SharePoint sites, such as lists, libraries, and other SharePoint artifacts, directly into Visual Studio when connected to a site. It is just like the Team Explorer in Visual Studio 2005/2008 when connected to Team Foundation Server.
  • There is a package explorer to configure WSP files, feature files, and other package-related files in SharePoint. This was good thinking by the team: it keeps track of all the solutions in use on a SharePoint site in one place. Pressing F5 will compile, build, debug, and deploy the solution to the specific site or farm, depending on the configuration.
  • Until now, if you wanted to create a custom SharePoint workflow, you had to create a C# SharePoint workflow project and configure it manually to implement it. In this version, they have added an ASPX workflow initiation form to the workflow project.
  • We already know about WSPBuilder, which creates the solution file easily without any extra configuration. Visual Studio 2010 adds WSP file import to create a new solution automatically.

Second, I like Silverlight technology and development, so I will explain the Silverlight features available in Visual Studio 2010.

  • While creating a Silverlight project, you have the option to select the Silverlight version, such as 2.0 or 3.0.
  • Great news for Silverlight developers: Visual Studio 2010 supports both Silverlight development and an editable design surface. WOW! This is the best feature, isn't it? [Thanks to Microsoft. When developing Silverlight applications, I always wondered why the Visual Studio team didn't give us the option to edit Silverlight objects in the editor inside Visual Studio; if I needed to edit anything, I always had to go to Microsoft Expression Blend. Now the editor is built in.]
  • More features are available with Silverlight 3.0 in Visual Studio 2010. We will discuss them in upcoming blog posts.

Another feature is that Visual Studio 2010 integrates a great module for testing [QA]. You can take a look at it here.

The final feature is very cool: the new look of Visual Studio 2010. A very rich, cool UI and a very good user experience. I will follow up with another post with good information about it and nice screenshots.

I think this information will help you understand Visual Studio 2010's features in different scenarios. Keep an eye on the blog for more updates.

Read More...

Flossenburg Work Camp

**warning - may contain what would be considered gruesome content and photos**

It feels inappropriate to say that I was "excited" to go see my first concentration camp...I've been fascinated for years. As wild as this may sound...I think I get it from my mother, who is also fascinated by WWII, concentration camps, Nazi Germany, Hitler's hold on people's minds and hearts... I started out with The Diary of Anne Frank, then moved on to The Hiding Place...two books still on my must-read list today. So, it was with "excitement" that I learned that a tour of the Flossenburg Work Camp would be available to me on Memorial Day. After talking to John, we decided it would be a most appropriate activity to remind ourselves how truly thankful we are for freedom and for those who fight to keep it that way...

There is conflicting information regarding the actual numbers of prisoners that lived and died here, so take the following as what I was told during my visit. Flossenburg Work Camp was opened in 1938. It's about 30 minutes from Vilseck (where I live) and set in a beautiful valley surrounded by green forest and hills. It was the fourth concentration camp to be built in Germany; however, they didn't call it a concentration camp...they called it a work camp because the prisoners were set to work cutting granite out of stone quarries....as you look at the photos below you will see that all of the rock came from this quarry. Now, even though they called it a work camp while others such as Dachau and Auschwitz were called "Death Camps", do not be fooled...as our dear tour guide, Herb, put it...instead of just being shot or gassed or hung or burned, these poor souls were worked to death.

The original camp was built for 1,600 prisoners; however, by the end of the war, over 100,000 prisoners were housed in Flossenburg and its sub-camps (approximately 18,000 in the main camp) located around the quarry. An estimated 73,000 prisoners were killed at Flossenburg, and when it was liberated in April 1945, the US Army found only two thousand sick prisoners left in the camp. An additional 14,000 had been forced on a "death march" for three days...4,000 died in those three days before the US Army caught up to them and saved the remaining 10,000...

Here is what is left of Flossenburg in photos...

The SS Headquarters building, located in the SS barracks area...this was built after the camp had been open for a few years, when it was growing quickly with more and more prisoners being transported here.


Our tour guide - Herb is a Presbyterian minister...his family is from Flossenburg, and his parents and grandparents remember well the days of the concentration camp. It was fascinating to hear his family's recollections and stories...probably what was most interesting to me was his comment about how the German people that lived less than a mile from this terrible place could not have KNOWN what was happening.

Herb told me that these German people had been told that this prison was for dangerous criminals, murderers, and rapists. He reminded me that the people from the city were not invited to tour the facilities or even go near the various camps or the quarry. He asked me to imagine living near a prison in the US...would I try to liberate or free those prisoners even if I thought they were being mistreated? He also reminded me that there was no television in those days, and newscasts were all pro-Nazi and would never tell the truth about camps like Flossenburg. As unbelievable as it sounds...many of these townspeople really did have no idea of the true evil happening in these camps. His explanation opened my eyes just a little more to how good people could allow these terrible things.


As you enter through the Headquarters building, you look off to the left and see a tall hillside. This hillside is now covered in German homes; however, it was the location of 16 prisoner barracks. They were built to house approximately 100 prisoners each. By the end of the war there were more than three prisoners per bunk...

How does it feel, I wonder, to have a home built on soil where those barracks once were?


This is the small garden that was the original memorial created for the survivors of the concentration camp. Anybody looking at these crosses would assume a religious Christian significance; however, what I learned is that these are actually Nazi symbols that were in the concentration camps. In 1995 there was a reunion of survivors. Up to that time, only a small garden filled with these crosses had been created as a memorial. It wasn't clear whether the survivors complained, or I wonder if their silence said all that needed to be said, because shortly after that the German government put about $2 million into a new memorial site. For whatever reason, they chose to leave this section alone and build around it instead of tearing it down.



Now we turn to the far southwest corner and one of the last two corner guard towers that are still standing. Those homes built on the hill are shadowed by this site. I wonder what they must think as they look out their windows in the morning....or do they think of it at all? The building is a church that was built later as part of the new memorial site. I was taken aback by how truly beautiful this area was...can you imagine it as a concentration camp? I couldn't...


And then we look to the northwest corner...and I saw the smokestack...the crematorium...I was surprised at first by how small the building was...no more than the size of a living room. How did it handle the number of deaths that occurred in this camp? Herb told me that the camp had originally contracted the services of the funeral home in the city of Flossenburg, but as more and more prisoners were sent to the camp, and more and more deaths occurred, they built this crematorium.


Inside the building were three small rooms. As I turned into the first room, I was shaken by the sight of the oven. It sat in the room and seemed to suck the air out of it, and the reality of this place settled heavily on my shoulders.


The second room was empty...I learned later that it was where they "cooled" the bodies. They threw them into a pile before burying them or burning them...the photos of what was found in that room at the liberation of the camp brought tears to my eyes.

The third room, I discovered, was the "operating room" for hurt or sick prisoners. If they died then it was a short toss into the cooling room...it was also where they laid the dead prisoners to remove their teeth. As I stood in there I found that I was hugging myself. I didn't want to touch that table. I didn't want to touch the walls. I knew there had been terrible suffering in the place where I was standing.


After leaving the crematorium you see a large valley with the church on one end and the crematorium on the other. In the middle is a remembrance for those that died. First, a long flat area with a sign that says, "Prisoners were shot in mass here." Then a mound of earth with a sign that says, "These are the ashes and bones from mass-burnings." Because of the significance of Memorial Day, the Veterans of Foreign Wars were conducting a memorial service. I stood silently and watched the group of soldiers I was with standing at attention as Taps was played by a bugler...

Day is done, gone the sun, From the hills, from the lake, From the skies.
All is well, safely rest, God is nigh.

Go to sleep, peaceful sleep, May the soldier or sailor, God keep.
On the land or the deep, Safe in sleep.

Love, good night, Must thou go, When the day, And the night Need thee so?
All is well. Speedeth all To their rest.

Fades the light; And afar Goeth day, And the stars Shineth bright,
Fare thee well; Day has gone, Night is on.

Thanks and praise, For our days, 'Neath the sun, Neath the stars, 'Neath the sky,
As we go, This we know, God is nigh.

The sign the prisoners saw as they entered the gates of Flossenburg read, "Work shall set you free." As I read the letters and stories of the survivors I couldn't help but wonder how so much evil could occur in a place that was so beautiful. The sun was shining and birds chirping...there was a slight breeze that whispered through the trees and the flowers were in full bloom...

ON A SUNNY EVENING - Third poem in the Terezin Concentration Camp Children's Cantata

On a purple, sun-shot evening
Under wide-flowering chestnut trees
Upon the threshold full of dust
Yesterday, today, the days are all like these.

Trees flower forth in beauty,
Lovely too their very wood all gnarled and old
That I am half afraid to peer
Into their crowns of green and gold.

The sun has made a veil of gold
So lovely that my body aches.
Above, the heavens shriek with blue
Convinced I've smiled by some mistake.
The world's abloom and seems to smile.
I want to fly but where, how high?
If in barbed wire, things can bloom
Why couldn't I? I will not die!

--Michael Flack, 1944
Read More...