Content Type Publishing – Rare Answers to Common Questions

At our August meeting I had some very intelligent questions asked of me.  Ones I didn't know the answer to, until now!  Here are the questions:

  • What happens to a list that has been assigned a published content type when we change and republish that content type?  Is it business as usual in that the content type will update across the lists?
  • What happens to a list that has been assigned a content type and we move that list to another site? 

Very interesting questions, and my default answer would be that publishing is nothing more than automating the "creation" of content types, and therefore all the usual rules apply.  Let's take a look at the basics.

Basic #1:  Does a list get updates made to a content type?

  1. Create a new content type
  2. Assign it to a list
  3. Create a new item based on the content type
  4. Update the content type
  5. Review whether the list's copy of the content type gets the updated columns/settings

ANSWER:  A list will only get the updates if you set the "Update all content types inheriting from this type?" radio button to Yes!
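To make that behavior concrete, here is a toy Python model of how a site content type and its list-level copies relate, and what the "update all inheriting types" choice controls.  This is NOT the SharePoint API; the class and method names are mine.

```python
# Toy model (not the SharePoint object model) of push-down updates from a
# site content type to the list-level copies that inherit from it.

class ContentType:
    def __init__(self, name, columns=None):
        self.name = name
        self.columns = list(columns or [])
        self.children = []          # list-level copies inheriting from this type

    def attach_to_list(self):
        """Attaching to a list creates a child copy of the site content type."""
        child = ContentType(self.name, self.columns)
        self.children.append(child)
        return child

    def update(self, new_column, push_down):
        """push_down models the 'Update all content types inheriting...' choice."""
        self.columns.append(new_column)
        if push_down:
            for child in self.children:
                child.columns.append(new_column)

site_ct = ContentType("Contract", ["Title"])
list_ct = site_ct.attach_to_list()

site_ct.update("ReviewDate", push_down=False)
print(list_ct.columns)   # ['Title'] – the list copy is untouched

site_ct.update("Owner", push_down=True)
print(list_ct.columns)   # ['Title', 'Owner']
```

The point: the list holds its own copy of the content type, so changes only reach it when you explicitly push them down.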

Basic #2:  Does an exported list to a new site collection keep the content type?

  1. Create a new content type
  2. Assign it to a list
  3. Create a new item based on the content type
  4. Export the list with contents as a template (.stp)
  5. Restore to a new site collection
  6. Review if the content type is created/attached

 ANSWER: Yes!  The content type will in fact move to the new site collection and be assigned to the newly created list!

 CTHub #1:  Does a list get updates of a content type via a Managed Content Hub?

  1. Create a new Managed Hub
  2. Configure the MMS to point to the hub
  3. Create a new content type in the hub
  4. Publish the content type
  5. Wait for the content type to be published
  6. Assign it to a list
  7. Create a new item based on the content type
  8. Update the content type
  9. Wait for the publish
  10. Review whether the list's copy of the content type gets the updated columns/settings

ANSWER:  Yes!  This implies that publishing sets the "Update all content types inheriting from this type?" radio button to Yes behind the scenes!

CTHub #2:  What happens if you create a list template and move the list to another site collection, then update the Content Type in the hub?

ANSWER:  The newly created list WILL get the updates from the hub as long as the site collection is located inside the same web application.

 CTHub #3:  What happens when you publish a content type with the same name as one already existing?

ANSWER:  The content type will not be updated, the Content Type Publishing logs will display an "Exists" error.


SharePoint Logging Database Exposed

So what exactly does this thing do anyway other than keep growing HUGE?!?  Let's take a look!  A little background…I have been running SharePoint 2010 RTM since it was released, and since then the Logging database has grown to 500MB.  The database itself seems to be dynamic in that you will start off with a small number of tables, but as you increase your feature usage, more tables will be added.

Question #1:  Just how many tables does your system have?  Here's my current list (but you may have even more!):

  • AnalysisServicesConnections
  • AnalysisServicesLoads
  • AnalysisServicesRequests
  • AnalysisServicesUnloads
  • Configuration
  • ExportUsage
  • FeatureUsage
  • ImportUsage
  • MonthlyPartitions
  • NTEventLog
  • PerformanceCounters
  • RequestUsage
  • Search_CrawlDocumentStats
  • Search_CrawlProgress
  • Search_CrawlWorkerTimings
  • Search_PerMinuteFTQueryLatency
  • Search_PerMinuteTotalOMQueryLatency
  • Search_PerMinuteTotalQueryLatency
  • Search_PerMinuteTotalUIQueryLatency
  • Search_QueryErrors
  • Search_VerboseFTQueryLatency
  • Search_VerboseOMQueryLatency
  • Search_VerboseQueryProcessorLatency
  • Search_VerboseUIQueryLatency
  • Search_VerboseWebPartQueryLatency
  • SiteInventory
  • SQLDMVQueries
  • SQLMemoryQueries
  • TimerJobUsage
  • TraceDiagnosticsDummy
  • ULSTraceLog
  • Versions

Each of those tables, of course, has its own set of partitions.

Question #2: What the hell is a partition and why do I have 32 of them?

You will see that there is a maximum of 31 partitions, but you will also notice that there is a partition number 0, which makes 32 partitions in total.  I think this is a bug in the stored procedures that create the table partitions.  The idea is to keep each day's information broken apart via a rolling schedule (this means that partition 9 may not map to calendar day 9).  You can look at the Configuration table to find what the current partition is.  As the Timer Job runs to process the log data, it moves to the next partition, recycling as it goes.
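The rolling schedule can be sketched like this.  This is just an illustrative model, not the actual stored-procedure logic; the modulo mapping is my assumption.

```python
# Illustrative sketch of a rolling daily partition scheme: 32 partitions
# (0 through 31) are reused in a cycle, so a partition number does not map
# to a calendar day.  NOT the actual SharePoint stored-procedure logic.

NUM_PARTITIONS = 32

def partition_for_day(days_since_start):
    """Return which partition would hold a given day's log data."""
    return days_since_start % NUM_PARTITIONS

# Day 9 and day 41 land in the same partition; the old contents get
# truncated before the new day's rows are written.
print(partition_for_day(9))    # 9
print(partition_for_day(41))   # 9
print(partition_for_day(32))   # 0 – the cycle wraps
```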

Question #3: How do I turn the freakin thing off?

This can be accomplished by going into Central Administration->Monitoring->Configure usage and health data collection, then unchecking the "Enable usage data collection" and "Enable health data collection" checkboxes.

Question #4:  I like the data, but there is too much, how do I shrink it?

If you explore the stored procedures in the database, you will see that the system simply truncates a partition when it moves to a new one.  You can do the same by looping through all the tables and truncating them.  You can also just run the function called "fn_TruncateUnusedPartitionsHelper".
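For the loop-and-truncate route, something along these lines would generate the statements.  The "_PartitionN" naming convention here is an assumption on my part; verify the actual table names in your logging database before executing anything this produces.

```python
# Sketch: generate TRUNCATE statements for every partition of a logging
# table.  The "<table>_PartitionN" naming is assumed – check sys.tables
# in your logging database first.

def truncate_statements(base_table, num_partitions=32):
    return [
        f"TRUNCATE TABLE dbo.{base_table}_Partition{n}"
        for n in range(num_partitions)
    ]

for stmt in truncate_statements("RequestUsage", num_partitions=3):
    print(stmt)
# TRUNCATE TABLE dbo.RequestUsage_Partition0
# TRUNCATE TABLE dbo.RequestUsage_Partition1
# TRUNCATE TABLE dbo.RequestUsage_Partition2
```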

Question #5:  I like the data that is generated, can I please have some more?

Hell yeah you can!  If you explore the timer jobs titled "Diagnostic Data Provider*", you will see they are disabled!  Turn them on and you will get even more data around:

  • Event Log
  • Performance Counters – Database servers
  • Performance Counters – Web Front Ends
  • SQL Blocking Queries
  • SQL Memory DMV
  • Trace Log

Ayman has a nice post on how you can use .NET Reflector to get in and see how these monsters do their dirty work.

Question #6:  Can you have too much data?

Oh yeah!  You can easily hit the 4GB limit on this database.  For those of you running SQL Express (why?), you will be truncating tables quite frequently!  For those of you with limited disk space, get ready to also be truncating tables.  Those of you with HUGE disk arrays, well…you should be fine.

Question #7:  Where is this documented schema thing I have seen so many marketing slides about?

Who the heck knows!  It seems that anyone and any application is able to write to this thing.  Even you could write to it with your own application.  I'm not sure how you get a new set of partition tables and attributes created, but eventually I will find some time to build something for it 🙂



SharePoint Tagging Exposed

I'm sure all of you have seen the new tagging feature of SharePoint.  This particular feature is a part of the User Profile Service Application AND the Managed Metadata Service Application.  If you don't have both of these created, then you simply won't get the ability to do tagging.  If you do, however, you can tag things with a generic "I Like It" or something more specific like "Awesome".  I have had a few different questions asked of me over the past few months, and here I'll answer them all:

#1 If a user tags something, will other users see the tag?

ANSWER: Depends!  There is a checkbox when you attempt to tag an item.  If you check it, the tag is saved privately for you and no one else.  If you do not check the box, then when someone goes to tag the same item, they will see a suggested tag list made up of all the tags that have been placed on that item.

#2 If a user tags something, will they see that tag indexed in a search for the item?

ANSWER: No, I have not gotten SharePoint or FAST Search to index the tags 🙁  This can certainly be set up with a custom iFilter or an extension point in the FAST Search content processing pipeline, but it doesn't look to be available out of the box.

#3 As an administrator, how can I search on all the tags that have been created?

Tags are saved as Metadata terms in the Managed Metadata Service Application that is assigned to the web application you are working with.  You can simply view all the terms under the "System->Keywords" group of your MMS service application.  You can also run the following query on your MMS database to get a listing of the tags and who "tagged" them (the SELECT list here is just everything; trim it to the columns you need):


  SELECT ecl.*, t.*, etl.*
  FROM [ECMChangeLog] ecl, ECMTerm t, ECMTermLabel etl
  WHERE ecl.ChangeType = 1
    AND ecl.ObjectUniqueId = t.UniqueId
    AND ecl.ObjectId = etl.TermId

#4 How do I query what has been tagged?

ANSWER: Items that have been tagged are not stored in MMS, but instead in the UserProfileServicesApplication_SocialDB_* database, in the SocialTags table.  A simple query of that table will give you what you are looking for.  Other helpful tables include:

  • SocialTagCloud_Everyone
  • SocialTagCloud_EveryoneByUrlBase
  • SocialTags
  • SocialTags_ChangeLog

#5 Are tags filtered (as in bad words like *bleep*)?

ANSWER: No, and Forefront doesn't filter them either (reference this post).

#6 Can tags be filtered?

ANSWER:  To answer this question, one must analyze the architecture of the tagging system.  Since all the tag keywords are stored in MMS, the question then becomes: can I filter MMS?  Looking at the structure of the MMS tables, we have:

  • ECMTerm
  • ECMTermDescription
  • ECMTermLabel
  • ECMTermProperty
  • ECMTermSet
  • ECMTermSetMembership

The value for the term is actually stored in the ECMTermLabel table.  Check out my new product that does SharePoint Social Filtering!

#7 What does the "Tag Cloud" web part do?

Being that SharePoint doesn't have any filtering…this web part can cause some pretty nasty HR problems.  As long as your users are doing what they are supposed to be doing, then this simply queries all the tags in the Metadata store and displays them.


PowerPivot and Excel Services

While working on getting PowerPivot and Excel Services running, I ran into this very annoying error:

ServerSession.ProcessServerSessionException: An exception during ExecuteWebMethod has occurred for server: http://servername:32843/811dd9f4d4ae48779f1dc7a03ba5d555/ExcelService*.asmx, method: GetHealthScore, ex: Microsoft.Office.Excel.Server.CalculationServer.Proxy.ServerSessionException: An error has occurred. —> System.ServiceModel.FaultException`1[System.ServiceModel.ExceptionDetail]: The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040) (Fault Detail is equal to An ExceptionDetail, likely created by IncludeExceptionDetailInFaults=true, whose value is: System.IO.FileLoadException: The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040)    at System.Reflection…    

Nothing in the event log…only this error with no reference to the assembly name or anything…annoying.  I found a nice post about debugging these types of errors.

You must download the SDK to get the tool.

After installing, I wasn't seeing any binding errors.  In the documentation, you're supposed to restart the services that are using .NET after enabling the logging.  I rebooted the server and the error didn't return…weird.


PowerPivot and Claims based authentication

Unfortunately, I have NOT gotten this to work; it has only worked on a web application that has "Classic Authentication" set.  If you set up a web application to have claims-based auth with both a "Forms" and a "Windows" login capability, the PowerPivot code will fail in its login method:

<nativehr>0x80070005</nativehr><nativestack>OWSSVR.DLL: (unresolved symbol, module offset=00000000000BB1EC) at 0x000007FEEFEFB1EC mscorwks.dll: (unresolved symbol, module offset=00000000002CF777) at 0x000007FEF62BF777 (unresolved symbol, module offset=00000000000E7BAA) at 0x000007FEF23A7BAA (unresolved symbol, module offset=0000000001A5D823) at 0x000007FEEC87D823 (unresolved symbol, module offset=0000000001AD201F) at 0x000007FEEC8F201F </nativestack>Access denied.

The only way to get this to work is to reset the web application to "Classic Authentication", and then the code works.  This is with the latest cumulative updates applied.

Another blog post by Denny goes a bit more into this, but the code should work, as it is simply calling the object model to update the workbook in the list on the site 🙁


PowerPivot and SharePoint Farms

When installing PowerPivot in your farm, you will have to manually add the following line to the SafeControls section of web.config on EVERY front-end server.  You will also have to add it back every time you make a change via the Web Application settings page in Central Administration:

<SafeControl Src="~/_layouts/powerpivot/*" IncludeSubFolders="True" Safe="True" AllowRemoteDesigner="True" SafeAgainstScript="True" />

If you don't do this, then you will get this error:

"The referenced file '/_layouts/PowerPivot/ReportGalleryView.ascx' is not allowed on this page."

This should have been done in the solution install, but they missed that step.  Hopefully the next version of the .wsp will contain the addition.

Special thanks to for the helpful hint


Regional Site Settings and Indexed Columns

What I believe to be a new change in SharePoint 2010 is the addition of some error checking around regional settings and indexed columns in lists.  When you try to change the regional settings and sort order to something with a different collation type (sorting based on a-Z0-9, or 'a' is bigger than 'A' type of stuff), you will get an error about the indexed columns.  Being that I have turned on all site and site collection features, several lists had been created with indexed columns.

Moral of the story is, you MUST set the sort order of the site FIRST, before you add any indexed columns or turn on any site and site collection features.  If you don't, then you are in for a few hours of work removing them and then adding them all back!


SharePoint 2010 Document Id Service

In working with the document id service in one of my labs I found some interesting "features" of this new SharePoint 2010 feature.

  • When you activate the Site Collection Feature, you MUST wait for the timer job to complete BEFORE you change the site prefix.  If you move too fast for SharePoint and change the site prefix, you will HOSE the site collection timer job setup.  I have found no way to fix this.  You will never see the Document ID column show up in the libraries.
  • After the timer job runs, if you want to change the prefix, you must update each item before its ID resets.  For some reason, it is not changed via the timer job.
  • The format of the out-of-box ID Provider is SITEPREFIX-LISTID-ITEMID, where LISTID is the auto-increment integer value for each list created on a web and ITEMID is the auto-increment integer value for the item in the list
  • IDs are ONLY generated for the "Document" content type…this is LAME.  I want all documents that inherit from "Document" to get an ID.  SP Team, this is easy…check for the content type ID lineage prefix of 0x0101 and simply assign the ID!
  • You can create your own Document ID Provider if you don't like this particular ID generation (part of my SharePoint Server Dev course)
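The ID format and the lineage check suggested above can be sketched in a few lines of Python.  The format and the 0x0101 prefix come from the notes above; the function names are mine and purely illustrative.

```python
# Sketch of the out-of-box Document ID format and the content type ID
# lineage check proposed above.  Helper names are hypothetical.

def make_document_id(site_prefix, list_id, item_id):
    """Out-of-box format: SITEPREFIX-LISTID-ITEMID."""
    return f"{site_prefix}-{list_id}-{item_id}"

def inherits_from_document(content_type_id):
    """True for Document (0x0101) and any content type derived from it."""
    return content_type_id.upper().startswith("0X0101")

print(make_document_id("HRDOCS", 3, 17))       # HRDOCS-3-17
print(inherits_from_document("0x0101"))        # True  (Document itself)
print(inherits_from_document("0x0120"))        # False (Folder lineage)
```

A lineage check like this is all it would take to hand IDs to every Document-derived type, not just Document itself.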

Document ID Service has a ways to go, but it will get there!

SharePoint 2010 Column and List Validation

Another interesting behavior I have found is with column and list validation.  This feature works as described, but unfortunately only for NEW items.  If you enable column or list validation AFTER items have been added to a list, you will get an annoying error telling you the item was updated previously when you edit those existing items.

This does not happen when adding a new item.  Basically this means you have to create all your columns and validation BEFORE you start adding any items to that list/library!


SharePoint Designer 2010 Workflow Bug (Check out causes Exception)

I was doing a demo today around SharePoint Designer 2010 workflows and found a bug in the design.  When you create a workflow and want to edit the .xoml file for the workflow, you must check it out.  Why would you do that, you ask?  Well, the Designer interface is far from perfect, and you may need to move a step around in case you didn't realize you had started to sub-nest your steps.  The only way to do this is to edit the .xoml file, and in order to edit the .xoml file, you must "Check Out" the file first.  After you check it out, you can then edit the file in Designer via the XML editor.

Saving the file but not "Checking It Back In" will cause the file to lose its link to the SharePoint workflow code.  This will cause the following error in the log files:

System.ArgumentNullException: Array cannot be null.  Parameter name: bytes     at System.Text.Encoding.GetString(Byte[] bytes)     at Microsoft.SharePoint.Workflow.SPNoCodeXomlCompiler.CompileBytes(Byte[] xomlBytes, Byte[] rulesBytes, Boolean doTestCompilation, String assemblyName, SPWeb web, Boolean forceNewAppDomain)

Checking the file back in will allow the workflow to execute again.  

RESOLUTION:  A check-out should cause the workflow to be unpublished.  You would then need to save and publish it again after check-in.