SharePoint 2010 Filter Web Parts

Where did they go?  Hmm, they are still in the same namespace, but they have been moved out of Microsoft.SharePoint.Portal.dll.  They now live in the Microsoft.Office.Server.FilterControls assembly, which is deployed only to the GAC.

Check out my reply in the MSDN Forums.

Chris

META: <type name="Microsoft.SharePoint.Portal.WebControls.QueryStringFilterWebPart, Microsoft.Office.Server.FilterControls,…>

Scripting SharePoint 2010, Zero to C (as in E=MC^2)

So you wanna script your entire SharePoint 2010 Farm install, eh?  Want me to show you how to do it?  I'm sure you do!  Here are the steps:

set share=\\servername\PreReqs

prerequisiteinstaller /unattended /SQLNCli:%share%\sqlncli.msi /ChartControl:%share%\MSChart.exe /IDFXR2:%share%\MicrosoftGenevaFramework.amd64.msi /Sync:%share%\Synchronization.msi /filterpack:%share%\filterpack.msi /ADOMD:%share%\SQLSERVER2008_ASADOMD10.msi

  • Create a config.xml file

<Configuration>
  <Package Id="sts">
    <Setting Id="LAUNCHEDFROMSETUPSTS" Value="Yes"/>
  </Package>
  <Package Id="spswfe">
    <Setting Id="SETUPCALLED" Value="1"/>
    <Setting Id="OFFICESERVERPREMIUM" Value="1"/>
  </Package>
  <Logging Type="verbose" Path="%temp%" Template="SharePoint Server Setup(*).log"/>
  <PIDKEY Value="{YOURKEY}"/>
  <Display Level="none" CompletionNotice="yes"/>
  <Setting Id="SERVERROLE" Value="APPLICATION"/>
  <Setting Id="USINGUIINSTALLMODE" Value="0"/>
  <Setting Id="SETUPTYPE" Value="CLEAN_INSTALL"/>
  <Setting Id="SETUP_REBOOT" Value="Never"/>
</Configuration>

  • Run the following command, pointing to your config.xml file, to install SharePoint

setup /config <pathto>config.xml

  • Configure SharePoint 2010

set s="C:\Program Files\Common Files\Microsoft Shared\web server extensions\14\BIN\stsadm.exe"
set ps="C:\Program Files\Common Files\Microsoft Shared\web server extensions\14\BIN\psconfig.exe"
set farmadmin=CONTOSO\SP_Farm
set sql=DBNAME
set p=Pa$$w0rd

%ps% -cmd configdb -create -server %sql% -database SharePoint_Config -user %farmadmin% -password %p% -passphrase %p% -admincontentdatabase SharePoint_AdminContent
%ps% -cmd adminvs -provision -port 9999 -windowsauthprovider onlyusentlm
%ps% -cmd services install
%ps% -cmd secureresources
%ps% -cmd installfeatures

  • Start all Services

Get-SPServiceInstance | ForEach-Object { Start-SPServiceInstance -Identity $_.Id }

  • Create ALL the service applications

$DbServerAddress = "DBNAME"
$farmPassPhrase = 'Pa$$w0rd'
$svcPwd = 'Pa$$w0rd'
$username = "CONTOSO\SP_Service"
$password = ConvertTo-SecureString $svcPwd -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential $username, $password
$managedAccount = New-SPManagedAccount -Credential $credential

$app = New-SPIisWebServiceApplicationPool "All Services" -Account $managedAccount

#New-SPUsageApplication -Name "Usage and Health Service Application"

New-SPAccessServiceApplication -ApplicationPool $app -Name "Access Services"

New-SPBusinessDataCatalogServiceApplication -ApplicationPool $app -Name "Business Connectivity Services"

New-SPExcelServiceApplication -ApplicationPool $app -Name "Excel Services Application"

$md = New-SPMetadataServiceApplication -ApplicationPool $app -Name "Managed Metadata Service"

New-SPMetadataServiceApplicationProxy -Name "Managed Metadata Service" -ServiceApplication $md

$pps = New-SPPerformancePointServiceApplication -ApplicationPool $app -Name "PerformancePoint Service"

New-SPPerformancePointServiceApplicationProxy -Name "PerformancePoint Service" -ServiceApplication $pps

New-SPStateServiceApplication -Name "State Service"

$ps = New-SPProfileServiceApplication -ApplicationPool $app -Name "User Profile Service"

New-SPProfileServiceApplicationProxy -Name "User Profile Service" -ServiceApplication $ps

$vgs = New-SPVisioServiceApplication -ApplicationPool $app -Name "Visio Graphics Service"

New-SPVisioServiceApplicationProxy -ServiceApplication $vgs -Name "Visio Graphics Service"

$was = New-SPWebAnalyticsServiceApplication -ApplicationPool $app -Name "Web Analytics Service Application"

New-SPWebAnalyticsServiceApplicationProxy -ServiceApplication $was -Name "Web Analytics Service Application"

New-SPWordConversionServiceApplication -ApplicationPool $app -Name "Word Service"

$serviceapp = New-SPSecureStoreServiceApplication -Name "Secure Store Service" -PartitionMode:$false -Sharing:$false -DatabaseServer $DbServerAddress -DatabaseName "SSO" -ApplicationPool $app -AuditingEnabled:$true -AuditlogMaxSize 30

$proxy = $serviceapp | New-SPSecureStoreServiceApplicationProxy -DefaultProxyGroup:$true -Name "Secure Store Service Proxy"

Update-SPSecureStoreMasterKey -ServiceApplicationProxy $proxy -Passphrase $farmPassPhrase

Start-Sleep -s 5

Update-SPSecureStoreApplicationServerKey -ServiceApplicationProxy $proxy -Passphrase $farmPassPhrase

$searchapp = New-SPEnterpriseSearchServiceApplication -Name "Search Service Application" -ApplicationPool $app

$proxy = New-SPEnterpriseSearchServiceApplicationProxy -Name "Search Service Application Proxy" -SearchApplication $searchapp

$si = Get-SPEnterpriseSearchServiceInstance -Local

Set-SPEnterpriseSearchAdministrationComponent -SearchApplication $searchapp -SearchServiceInstance $si

$ct = $searchapp | New-SPEnterpriseSearchCrawlTopology

$crawlStore = $searchapp.CrawlStores | where {$_.Name -eq "Search_Service_Application_CrawlStore"}

New-SPEnterpriseSearchCrawlComponent -SearchApplication $searchapp -CrawlTopology $ct -SearchServiceInstance $si -CrawlDatabase $crawlStore

$ct | Set-SPEnterpriseSearchCrawlTopology -Active

Write-Host -ForegroundColor Yellow "Waiting on Crawl Components to provision…"

while ($true) {
    $ct = Get-SPEnterpriseSearchCrawlTopology -Identity $ct -SearchApplication $searchapp
    $state = $ct.CrawlComponents | where {$_.State -ne "Ready"}
    if ($ct.State -eq "Active" -and $state -eq $null) {
        break
    }
    Write-Host -ForegroundColor Yellow "Waiting on Crawl Components to provision…"
    Start-Sleep 2
}

$qt = $searchapp | New-SPEnterpriseSearchQueryTopology -Partitions 1

$p1 = ($qt | Get-SPEnterpriseSearchIndexPartition)

New-SPEnterpriseSearchQueryComponent -IndexPartition $p1 -QueryTopology $qt -SearchServiceInstance $si

$propertyStore = $searchapp.PropertyStores | where {$_.Name -eq "Search_Service_Application_PropertyStore"}

$p1 | Set-SPEnterpriseSearchIndexPartition -PropertyDatabase $propertyStore.Id.ToString()

$qt | Set-SPEnterpriseSearchQueryTopology -Active

You're done, enjoy!
Chris Givens aka CJG

BCS/BDC Report Card

First and foremost, I LOVE SHAREPOINT.  It's awesome, it pays my bills and IT ROCKS.  I give credit where credit is due, but I'm also fair in that I will point out flaws too.  As some of you surely remember, I tweeted a little rant about BCS and how annoyed I was that customer after customer keeps asking me to make BCS do something it just can't do (point proven by the tweet I got from Paul Andrew, "BCS is not a silver bullet", to which I replied, "It's a marketing problem" driven by conference presenters saying it's the save-all solution of the century).  A lot of people feel that because it can now both READ and WRITE data, you can DO ANYTHING with it.  WRONG.  So very wrong.  In an effort to prove that point, very simply, here's my BCS Report Card (2007 scores in parentheses):

Read Data     A+  (B)
Search Data  B-  (B-)
Write Data     D (I)

Why, you say?  Why not A's across the board?  Let me explain it to you.  Let's start with a very simple table:

CREATE TABLE [dbo].[Contact](
    [FirstName] [varchar](50) NOT NULL,
    [LastName] [varchar](50) NOT NULL,
    [ModifyDate] [datetime] NOT NULL,
    [CreateDate] [datetime] NOT NULL
)

Using BCS, I can point to this data, read it, NOT search it, and write it.  Why can't I search it?  I didn't give it an identifier/primary key.  That means I can't point BCS at repeated, non-unique data for searching and indexing, and hence why it gets a B-.  Adding the following will make it searchable (and now usable in SharePoint Designer 2010):

ContactId int primary key 

NOTE: it doesn't have to be a primary key; you just need a column to act as an identifier.
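As a concrete sketch against the Contact table above (the column and constraint names here are just placeholders), that one change is:

```sql
-- Hypothetical fix for the table above: give BCS a unique identifier
ALTER TABLE [dbo].[Contact]
ADD [ContactId] int identity(1,1) NOT NULL
    CONSTRAINT [PK_Contact] PRIMARY KEY
```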

Perfect.  A few more clicks in Designer and I have an external content type with a list using it.  Another couple of clicks in the Ribbon and I can even have the data syncing with my Outlook (Fabian Williams first blogged an example of this).  A few more clicks in my Search Service Application and it's indexed.  AWESOME!  Let me repeat that…AWESOME!  The problem comes into play when I add another table with a relationship:

-- NEW TABLE --
StatusId int primary key
ShortName varchar(50)
LongName varchar(50)
ModifyDate datetime
CreateDate datetime

-- ADD REF COLUMN TO CONTACT TABLE --
StatusId int

Regenerating my BCS/BDC application (which is not very friendly in Designer, but that's a topic for another time) generates the same basic view of the contact data.  However, notice that THE USER IS RESPONSIBLE for determining what the value of the StatusId column is…REALLY?  REALLY?  Of course, if they type in something bad they get this:

 

Wait, what was that checkbox on the List generation?  Generate InfoPath form? 

 

That sounds promising, but:

 

The point is that BCS/BDC really only works with a single non-relational entity when it comes to WRITING data.  Even though with SharePoint Designer (or manually, if you are so inclined, because you finally get so frustrated doing it with Designer that you want to throw your laptop) you can create "Associations" to build relationships between similar or disparate data sources (another awesome feature), that really only helps the BDC web parts with sending filter values.  What does this mean?

BDC ONLY WORKS WITH DE-NORMALIZED MEANINGFUL TABLES WHEN YOU WANT FULL* FUNCTIONALITY

The asterisk on "Full" simply means the functionality offered by BCS (ref scorecard above), not what you would REALLY want.
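One workaround, if you control the database, is to hand BCS a de-normalized shape yourself; here's a sketch against the hypothetical tables above (view name is made up, and the view is read-only unless you add INSTEAD OF triggers):

```sql
-- Sketch: flatten Contact + Status into one BCS-friendly entity
CREATE VIEW [dbo].[vwContactFlat] AS
SELECT c.ContactId, c.FirstName, c.LastName,
       s.ShortName AS StatusName
FROM [dbo].[Contact] c
LEFT JOIN [dbo].[Status] s ON s.StatusId = c.StatusId
```

Point the external content type at the view instead of the base table; writes, of course, still land you in the scorecard's D territory.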

What do I think would be ideal?  The best option is to allow us to edit the InfoPath form!!!  Then we can map the control to a drop-down populated from a data source!  One would think that we should be able to edit the list column and point to the corresponding column in the target association table to generate a selectable drop-down (or even select the column in the BCS wizard or schema), but those are locked:

 

OR even better, when *generating* the BCS/BDC application definition, make it possible to create Metadata "Pointers" in the metadata service to these "distinct" items and have the column use the metadata service. 

 

But even if they (the SP Product Team) go to the extreme to implement all this later (which seems inevitable as soon as they see this), "distinct" items bring in a whole different set of problems, like caching and large-list issues (since they have already addressed this pattern of problem in large lists, it seems trivial for them).

Why did I asterisk "generating"?  BCS is a code-gen tool.  It generates the BDC application definition file (and some other medium-difficulty plumbing) for you with a fancy wizard.  Let me add, it's a very BASIC codegen tool.  In the post that follows this one, I will demonstrate for the first time my CodeGen tool (with its SharePoint extensions).  Watch for that post!

Some of the comments I received when I tweeted my original rant were:

  • Fabian & Chakkaradeep – use a .NET type: this is why I give an A+ to READ, but it still doesn't resolve the WRITE issues
  • Some said use BDCMetaman; sorry, it doesn't solve these issues either, it's just another codegen tool like BCS

In summary, to quote a really great movie "Choose, but choose wisely" your integration points with SharePoint.  In the end you may fall over dead and old before you get it working (or realize it just won't work) with BCS (at least in this version #2).

Chris

SharePoint 2010 Claims based Forms Authentication

Forms based authentication in SharePoint 2010 has changed.  It is now Claims based Forms Authentication, which means all the forms configuration stuff you see all over the web doesn't quite work in the same way.  Even if you set the membership and role providers in your web applications and central administration, your forms auth still won't work!

In SharePoint 2010, several new WCF services handle the mapping of claims to the backend forms auth identities.  Out of the box, claims based forms auth doesn't work!  You gotta remove the <clear/> element from the Membership and Role provider sections in the web.config of {SharePoint Root}\WebServices\Root.
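As a sketch, that section of the web services web.config ends up looking something like this once the <clear/> elements are gone (the FBA provider entries and connection string name here are placeholders; yours will match the providers you registered):

```xml
<membership defaultProvider="i">
  <providers>
    <!-- <clear/> removed: it was wiping the claims providers SharePoint registered -->
    <add name="FbaMembershipProvider"
         type="System.Web.Security.SqlMembershipProvider, System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a44"
         connectionStringName="FbaDB" applicationName="/" />
  </providers>
</membership>
<roleManager enabled="true" defaultProvider="c">
  <providers>
    <!-- <clear/> removed here as well -->
    <add name="FbaRoleProvider"
         type="System.Web.Security.SqlRoleProvider, System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a44"
         connectionStringName="FbaDB" applicationName="/" />
  </providers>
</roleManager>
```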

Chris

WCF Timeouts on small memory'd SharePoint 2010 machine

In writing labs for an upcoming Microsoft course, I found that my environment's limitations were less than satisfactory for running all of the SharePoint 2010 services at the same time.  Even though all web applications and app pools were running, I was getting WCF timeouts, mainly for the User Profile Service.  The default timeout value in SharePoint for the services is pretty much set to 20 seconds.  If you have a machine that doesn't have the full 8GB of memory needed, then you may find your services take a while to spin up.  This can be devastating to some of the setup/install things you may be doing (because there are still many things that are NOT transactional).

I found that by increasing the WCF timeout, I could avoid a lot of setup/install problems (mainly around service applications like User Profile).  To change the WCF timeout, open the C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\WebClients\Profile\client.config file and update the bindings to double the timeout from "00:00:20" to "00:00:40".  Bam!  Your user profile won't time out all the time and the service management page will display!
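For reference, the edit is just the timeout attributes on each binding element in that client.config; something along these lines (the binding name is illustrative):

```xml
<basicHttpBinding>
  <!-- doubled from the default 00:00:20 -->
  <binding name="UserProfileServiceBinding"
           closeTimeout="00:00:40"
           openTimeout="00:00:40"
           receiveTimeout="00:00:40"
           sendTimeout="00:00:40" />
</basicHttpBinding>
```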

Chris

 

SharePoint 2010 Generic Solution Validator

Sandboxed Solutions are not a very practical way of testing solutions in production.  They open up a world of bad things to happen.  That being said, there are still some good things that can come from them.

The blocking solution feature provided in Central Administration is a complete joke; it won't help you govern sandboxed solution deployment AT ALL.  However, I have built a tool that will solve many of the deployment issues with Sandboxed Solutions.  I call it the Generic Solution Validator.  It is scoped to the Site Collection level and allows you to ALLOW or BLOCK solutions based on various properties, ones much more valuable than the SCA provides.

I have posted the Generic Solution Validator to CodePlex.

http://bit.ly/4V8Yv7

Enjoy,
Chris

SharePoint Remote BLOB Storage (RBS) – Step by Step Install

I just setup RBS/RBS FILESTREAM on my 2010 farm.  Not so sure about the functionality being pushed to the SQL Server team for this.  I really liked the COM based approach of EBS in 2007 (which is now marked as obsolete, but supposedly still supported – and when I say 'supported' I simply mean it still works – going forward in 2010), but hey, when you don't have budget/time, send it off to someone else that does, right?  These steps will get your RBS FILESTREAM SQL provider working.  Sorry, no screenshots; you only get those in the courses.

So why RBS?  What is it, you ask?  It stands for Remote BLOB Storage.  Well, let's start with the fact that when you add a document to SharePoint, it goes into the content database (the binary is serialized into a stream and put into the database; well, not really put into the database, but a pointer in a SQL Server row to a set of 8K pages somewhere on disk that represent those files).  All documents go into the content database without discrimination.  Should there be discrimination?  Yep.  Some people would migrate file shares to SharePoint, which would include install ISOs, some being gigabytes in size.  These days we can safely say that SQL Server is a high-volume transactional store and yes, it has the ability to store BLOBs, just not as efficiently as some applications would like.  Because of this, IT depts would say "No, don't put that large file in SharePoint."

Well of course that leads to confusion as to what goes in and out of SharePoint (the fileshare lived another day in 2007).  In 2007, the SharePoint team introduced EBS, which was a COM component based implementation of passing off the BLOB to something else to manage.  RBS is the continuing evolution of this, with SQL Server becoming the management point.  Now IT depts can say, "Sure, throw that into SharePoint!"  No confusion, SharePoint is now the hub of everything!  IT depts can set the size at which RBS kicks in and sends the file somewhere else.  If the RBS becomes obsolete, there are ways to migrate back into SharePoint or change the RbsId to migrate to a new store (you will see the new RbsId column in the content database).  Writing an EBS or RBS implementation is NOT easy.  I did an EBS for 2007, and I'll be the first and not the last to tell you, the COM interactions, memory management, and file manager components require some thought and patience.  That being said, you will be at the mercy of Microsoft and 3rd parties to create scalable, robust EBS/RBS implementations for your SharePoint system.

After publishing this, we had a nice twitter conversation about StoragePoint. It has some awesome RBS connectors!  You should check them out!

The detailed TechNet version of RBS install process is here.  Mine is a more condensed version of the basic steps.

If you like this, be sure to follow me on twitter! More to come!

Exercise 1 – Setup RBS FILESTREAM

Purpose: Setting up RBS FILESTREAM is fairly simple.  In this lab we will configure RBS in SQL Server and then the RBS client for SharePoint 2010.

Result: A content database that uses RBS FILESTREAM.

Task 1 – Enable FILESTREAM on SQL Server

  1. Open SQL Server Configuration Manager
  2. Click “SQL Server Services”
  3. Right click “SQL Server (MSSQLSERVER)”, select “Properties”
  4. Click the “FILESTREAM” tab
  5. Check all the checkboxes
  6. Click “Apply”
  7. Open SQL Server Management Studio
  8. Connect to the localhost server
  9. Right click the instance, select “Properties”
  10. Click the “Advanced” tab
  11. For “Filestream Access Level”, select “Full access enabled”
  12. Click “OK”
  13. Restart the SQL Server service

Task 2 – Prep the databases

  1. Open a query window and run the following SQL command:

use [WSS_Content_100]

if not exists (select * from sys.symmetric_keys where name = N'##MS_DatabaseMasterKey##')
    create master key encryption by password = N'Pa$$w0rd'

 

  2. Run the following:

if not exists (select groupname from sysfilegroups where groupname = N'RBSFilestreamProvider')
    alter database [WSS_Content_100] add filegroup RBSFilestreamProvider contains filestream

 

  3. Run the following:

alter database [WSS_Content_100] add file (name = RBSFilestreamFile, filename = 'c:\BlobStore') to filegroup RBSFilestreamProvider

 

  4. Expand “Databases”
  5. Right click “Databases”, select “New Database”
  6. For the name, type “RemoteBlobStorage”
  7. Click “OK”

Task 3 – Install the RBS Client

  1. Run d:\lab work\RBS_x64.msi
    • NOTE: This task walks you through the GUI of the install program so you can see the various items that you COULD configure; later you will see that this is not necessary, as we will re-run it in silent mode for SharePoint.  (The product team has also suggested that you NOT run this step, as it may add extra settings that could cause problems later!)

  2. Click “Next”
  3. Click “I accept the terms…”
  4. Click “Next”
  5. Click “Next”
  6. Click “Next”
  7. Click “Test Connection”
  8. Click “Next”
  9. Click “Next”
  10. Check the “Show the advanced configuration options” checkbox
  11. Click “Next”
  12. Review the settings:
  13. Click “Next”
  14. Review the properties of the “Maintainer Task”; this is used to clean up orphaned records that may no longer exist in SharePoint (a user deleted the file in the document library).  Check all the checkboxes:
  15. Click “Next”
  16. For this lab, set all the logging settings to “Verbose”:
  17. Click “Next”
  18. Click “Install” – AGAIN NOTE: this is for FYI only; you should click “Cancel” if doing this for real
  19. Click “OK” in the task window

Task 4 – Configure SharePoint 2010

  1. Open a SharePoint Management Console
  2. Run the following commands from the location of the RBS_x64.msi file (this needs to be run against each content database that you want to support RBS):


msiexec /qn /lvx* rbs_install_log.txt /i RBS_x64.msi TRUSTSERVERCERTIFICATE=true FILEGROUP=PRIMARY DBNAME="WSS_Content_100" DBINSTANCE="servername" FILESTREAMFILEGROUP=RBSFilestreamProvider FILESTREAMSTORENAME=FilestreamProvider_1

msiexec /qn /lvx* rbs_install_log.txt /i RBS_x64.msi DBNAME="WSS_Content_100" DBINSTANCE="servername" ADDLOCAL="Client,Docs,Maintainer,ServerScript,FilestreamClient,FilestreamServer"

 

  3. Open the rbs_install_log.txt file; at the end of the file, look for:
    • Product: SQL Remote Blob Storage — Configuration completed successfully.
  4. You can also check the content database and look for some new “mssql*” tables
  5. Run the following (note this only works if the web app has one content database):


$cdb = Get-SPContentDatabase -WebApplication http://servername:100

$rbss = $cdb.RemoteBlobStorageSettings

$rbss.Installed()

$rbss.Enable()

$rbss.SetActiveProviderName($rbss.GetProviderNames()[0])

$rbss

 


Task 5 – Test your RBS Provider

  1. On the SQL Server, open the “c:\BlobStore” folder; this is where your blobs will go by default
  2. Open the team site (http://servername:100)
  3. Add a new document called “MyRBSFile” to your document library (make sure it is above 100K, as you can set the file size boundary at which RBS moves files between the content DB and RBS connectors)
  4. Refresh the c:\BlobStore folder; you should see a new file in one of the directories
  5. Run the following query against your content database (NOTE: run this against your dev environment ONLY so as to not cause any locks on your prod databases):


select ad.SiteId, ad.Id, LeafName, RbsId
from AllDocs ad, AllDocStreams ads
where ad.Id = ads.Id
and RbsId is not null

 

 

 

  6. You should get results back listing all the files that have been submitted to RBS.

Enjoy!
Chris

Solution Validators – Sandboxed Solutions

So you read my last post and decided that maybe Sandboxed Solutions aren't that great of an idea.  You decided to implement a Solution Validator to limit what your ole developers are doing.

You got the validator created and you installed it, but then realized it's not quite right.  So, you undeploy it, right?  Oh, wait – every object that goes in the Object hierarchy table has to have a "public" default constructor for deserialization (i.e., being pulled out of the Config DB and rehydrated into memory).  Otherwise you get a nice error in Visual Studio and/or Central Admin and you won't be able to retract the solution.

Exception in RefreshCache. Exception message : "MySolutionValidator.MySolutionValidator cannot be deserialized because it does not have a public default constructor."
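The fix going forward is to give the validator the public default constructor the deserializer wants; here's a minimal sketch (class name, GUID, and Signature value are placeholders; this assumes the usual SPSolutionValidator base class from Microsoft.SharePoint.UserCode):

```csharp
// Sketch only: compile against Microsoft.SharePoint.dll; use your own GUID
[System.Runtime.InteropServices.Guid("11111111-2222-3333-4444-555555555555")]
public class MySolutionValidator : SPSolutionValidator
{
    // The public default constructor the config DB deserializer needs;
    // without it you get the RefreshCache error above
    public MySolutionValidator() { }

    public MySolutionValidator(SPUserCodeService userCodeService)
        : base("MySolutionValidator", userCodeService)
    {
        this.Signature = 1001; // bump when validation logic changes
    }

    public override void ValidateSolution(SPSolutionValidationProperties properties)
    {
        base.ValidateSolution(properties);
        properties.Valid = true; // your real checks go here
    }
}
```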

K, if you followed the little article up there, then you'll notice that particular piece is missing!  Now you're stuck…how do you get it out?  Well, you gotta run a command against the config database.  NOTE: This is a highly dangerous operation; if you mess it up, your Farm gets deleted!

delete from Objects
where properties like '%MySolutionValidator%' -- or, to be more safe, use the full assembly name

You could also run the following stsadm command provided you do the query to find the id in the Config database:

select id, properties from Objects
where properties like '%MySolutionValidator%'

STSADM -o deleteconfigurationobject -id "id retrieved from object table"

This will clear the object and the retract will succeed.  This will be the case for ANYTHING that goes into the object hierarchy table.

Chris

 

Missing Server Side Dependencies – 8d6034c4-a416-e535-281a-6b714894e1aa

So what is this, you ask?  Well, I did a little digging: I watched the Timer Job and the query it sent (to the content database of the Central Admin site):

SELECT tp_WebPartTypeId, COUNT(1), tp_Assembly, tp_Class
FROM AllWebParts (NOLOCK)
WHERE tp_WebPartTypeId IS NOT NULL GROUP BY tp_WebPartTypeId, tp_Assembly, tp_Class

You get back a result set that has a null in the tp_Assembly column for the web part.  What is this web part, you ask?  Well, it is the Microsoft.SharePoint.Portal.WebControls… no wait, it is the Microsoft.Office.Server.Search.WebControls.SearchTopologyView web part in the Microsoft.Office.Server.Search, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c assembly.

If you do a query to see where these 6 instances are:

select *
from AllWebParts
where tp_WebPartTypeId = '8D6034C4-A416-E535-281A-6B714894E1AA'

You will see that the web part exists on two pages:

  • SearchAdministration.aspx
  • SearchFarmDashboard.aspx

Open those pages, notice…It DOES exist!

Now, here is the funny thing – rerun the queries.  As soon as you open those pages, the database gets updated and the error will go away.  Weird!!!

Enjoy!
Chris