Saturday, January 05, 2013

I want my UFB

[rant]
OK, so we've just finished our first full month on Telecom's 500Gig plan.  Yes, that's 500Gig, half a terabyte, or in technical terms, oodles of data. (I'm aware of whole schools that have less; in fact the firm I work for doesn't have a lot more...)


Now I've been with Telecom/Xtra since the beginning of broadband, when my ASUS ADSL router hooked me directly into work so I could work from home two days a week when Amber was born (that's back in July 2001).
And despite Telecom/Xtra's less than stellar reputation (it took them a year and a case of beer to get my billing right) and everyone telling me I should switch to ISP X, I've stuck it out.  Not that I haven't been tempted...
Over the years, various plans (more gigs or more speed on unbundled exchanges) have been very tempting, but Telecom/Xtra have consistently come back with something that's been "enough" to stop me switching.


Disclaimer: I don't use Xtra for email and aside from the billing woes of my first year, you can count the number of times I've had to call the Telecom/Xtra helpdesk (for Internet) on one hand and still have some spare fingers...

So there I was on Telecom/Xtra's 160Gig plan, looking longingly at various 200Gig and Unlimited plans (well, none are truly unlimited; they all have a variety of constraints/restrictions, many of which aren't spelled out, so you just know you're walking into a world of hassle), thinking maybe it's finally time to switch, especially given that where we live is not on the 3 year (err... only 2 years of that left now, I think) UFB roll-out plan.  Then along comes the new Telecom/Xtra 500Gig plan, for $6/month less than I'm paying for their 160Gig plan (how does that work?). Leap!

So, you're probably all thinking "Do you REALLY need 500Gig? Surely 160Gig is more than enough?"; after all, most of you are probably on far smaller plans and they are working just fine for you.  OK, so first up, we have a number of Internet-connected devices (TV, iPhone, iPad, two iPod Touches, PS2, MagicTV (FreeView recorder), AppleTV, three laptops, a desktop PC and two servers). That's more than some people, but I know a fair number who have more. The updates alone put a dent in our monthly data quota (for the geeks: I used to run a proxy to try and cache things, but it generated almost as many hassles as it solved), plus my girls are into YouTube and Ngaire loves posting photos and videos for her friends.  Oh, and I download a fair bit too.

So we've been banging up against our 160Gig limit fairly frequently, popping over it a number of times, despite having a little gauge on my laptop that tells me how much we've used...

But with 500Gigs I no longer have to worry about my quota; I would have to try seriously hard to actually use that up.  And to prove it, my first month came in at 112Gig (note: we were away for a week, which would have decreased it).  So now I not only have oodles of quota (who needs "unlimited" when your quota is more than you can use?) but I also don't feel I have to use it up.

But what has all this got to do with UFB?
As I've mentioned, our part of our street is not included in the initial 3 year UFB roll-out (despite being in an affluent area, two blocks from a primary school, two blocks from an intermediate, and with the Telecom "box" right at the end of our driveway), so I'm unlikely to get UFB anytime soon and I'm looking to make the best of what speeds I can get (18Mbit/s down, 940Kbit/s up according to my router today).  According to my crude math, if I could saturate my connection continuously, it would take about 2.5 days to use up my 500Gig plan! Now that seems quite quick.


UFB comes in two speeds:
  • 30Mbit/s down & 10Mbit/s up
  • 100Mbit/s down & 50Mbit/s up.
Orcon offer UFB plans in 30Gig, 60Gig and "unlimited", but their "unlimited" fair use policy is based on the average usage of their customer base - the wording is sufficiently thin that I suspect it's not the "average" of customers on their "unlimited" plan, and it possibly includes non-UFB plans! So your potential usage is being pulled down by customers on 30Gig plans.
 
So anyway, 30Gigs at 30/10Mbit/s - 2.3 hours.
60Gigs at 100/50Mbit/s - 54 minutes!
Even 500Gigs at 100/50Mbit/s is 7.5 hours.
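
For the geeks who want to plug in their own numbers, here's a rough PowerShell sketch of that maths (it assumes 1Gig = 1024MB and that you can saturate download and upload at the same time):

# How many hours to burn through a data cap if the connection is flat out the whole time?
function Get-HoursToBurnQuota([double]$quotaGig, [double]$downMbit, [double]$upMbit)
{
    $totalMbit = $quotaGig * 1024 * 8              # quota converted to megabits
    $seconds = $totalMbit / ($downMbit + $upMbit)  # combined down + up throughput
    [Math]::Round($seconds / 3600, 1)              # hours, to one decimal place
}

Get-HoursToBurnQuota 500 100 50    # the 500Gig on 100/50Mbit/s case: roughly 7.6 hours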
 
OK, so sustaining that level of bandwidth is unlikely, but I'm just trying to show how small the quota is compared with the bandwidth on offer. After all, if a "hacker" decided to flood your connection, it wouldn't take them long to push you into extra costs or back to dial-up stone-age speeds.
 
What's my point?
For UFB to "work" for home usage, a number of things need to exist: content (HD video [Netflix, Hulu, YouTube, etc], internet radio [Spotify, etc] and HD streamed games being the most obvious and bandwidth-intensive currently available), services (online photo/video editing? HD multi-user video chat [e.g. Skype], to name some obvious ones) and consumers to consume it ("build it and they will come"?).


Now the likes of the USA have most of these things to some degree or another (though often not available to those outside the USA), and if you're prepared to pay extra in NZ, you can even get some of the premium video (Sky has the TV/Movies pretty much tied up, so no sign of Hulu or Netflix here).

But the key catch in NZ is the quotas on our Internet plans. 30Gig = ~ 30 hours of average quality 720p video or about 6 hours of average quality 1080p video. So the Internet is not going to replace your TV/DVD/BluRay in any great hurry (yes, various ISPs have deals where iSky/YouTube/etc don't count against your monthly quota, but what if you get your video from somewhere else?).

So until consumers don't have to worry about going over their quota after watching only a handful of movies each month (or their kids watching a string of TV shows), I don't think the UFB story is sufficiently compelling for most homes.

However, if I can keep my 500Gig plan, I'd leap on the 100/50Mbit/s UFB. I'd be back to watching my usage closely, but Ngaire would get a much better experience uploading her videos and photos, and we should be able to watch multiple videos while downloads are going on.

Though it will be interesting to see how various NZ services and the Southern Cross Cable (NZ's primary link to the rest of the world) cope with consumers having significantly more bandwidth.
[/rant]

Tuesday, October 30, 2012

Enumerating SharePoint 2007 user permissions

As part of another project, I recently needed to enumerate through all the sites, subwebs, lists and items to determine which users had been assigned what rights.

Originally I came across this post by Roger Cormier, which provided a great base, but had a few issues:
  1. It was SP2010 based (SP2007 doesn't have Get-SPWeb)
  2. It didn't handle Items
  3. It didn't handle sub-sites/webs/lists/items with unique permissions whose parent site/web/list didn't have unique permissions.
  4. It didn't handle membership providers
So after some mangling, I submitted this back to Roger.

I then distilled it down to the following script, which is cruder, but outputs a CSV that I can then use for various automated tasks.

Feel free to do what you like with it.

[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
function stripProvider([string]$userName)
{
    if($userName.split("\").count -gt 1)
    {
        $userName.split("\")[1]
    }
    elseif ($userName.split(":").count -gt 1)
    {
        $userName.split(":")[1]
    }
    else
    {
        $userName
    }
}

#This function determines the source of the user AD/Local NT vs Membership provider
Function UserSource([string]$userName)
{
    if($userName.split("\").count -gt 1)
    {
        $userName.split("\")[0]
    }
    elseif ($userName.split(":").count -gt 1)
    {
        $userName.split(":")[0]
    }
    else
    {
        ""
    }
}
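
# Each output line is a CSV row: scope (web/list/item), assignment type (direct/role),
# user name, source (AD domain or membership provider), URL, and the role definition or group name.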


$farm = [Microsoft.SharePoint.Administration.SPFarm]::Local
$farmWebServices = $farm.Services | where -FilterScript {$_.GetType() -eq [Microsoft.SharePoint.Administration.SPWebService]}
foreach ($farmWebService in $farmWebServices) {
  foreach ($webApplication in $farmWebService.WebApplications) {
    foreach ($site in $webApplication.Sites)
    {
        foreach ($web in $site.AllWebs)
        {
          # Write-Host "Site Collection: ID:" $site.ID " - URL: " $site.Url
          if ($web.HasUniqueRoleAssignments)
          {
            foreach ($RoleAssignment in $web.RoleAssignments)
            {
                if((UserSource $RoleAssignment.Member.LoginName) -ne "")
                {
                    "web,direct," + (stripProvider($RoleAssignment.Member.LoginName)) + "," + (UserSource($RoleAssignment.Member.LoginName)) + "," + $web.Url + "," + ($RoleAssignment.RoleDefinitionBindings | select name).name
                }
                else
                {
                    $allUsers = $Roleassignment.member.users
                               
                    #Perform some action against all members returned.
                    foreach($User in $AllUsers)
                    {
                        "web,role," + (stripProvider($User.LoginName)) + "," + (UserSource($user.LoginName)) + "," + $web.Url + "," + $RoleAssignment.member.name
                    }
                }
            }
          }
         
          foreach ($aList in $Web.lists)
          {
              if ($aList.HasUniqueRoleAssignments)
              {
                  foreach ($RoleAssignment in $aList.RoleAssignments)
                  {
                      if((UserSource $RoleAssignment.Member.LoginName) -ne "")
                      {
                          "list,direct," + (stripProvider($RoleAssignment.Member.LoginName)) + "," + (UserSource($RoleAssignment.Member.LoginName)) + "," + $web.Url + $aList.DefaultViewUrl + "," + ($RoleAssignment.RoleDefinitionBindings | select name).name
                      }
                      else
                      {
                          $allUsers = $Roleassignment.member.users
                         
                          #Perform some action against all members returned.
                          foreach($User in $AllUsers)
                          {
                              "list,role," + (stripProvider($User.LoginName)) + "," + (UserSource($user.LoginName)) + "," + $web.Url + $aList.DefaultViewUrl + "," + $RoleAssignment.member.name
                          }
                      }
                  }
              }
              foreach ($anItem in $aList.Items)
              {
                  if ($anItem.HasUniqueRoleAssignments)
                  {
                      foreach ($RoleAssignment in $anItem.RoleAssignments)
                      {
                          if((UserSource $RoleAssignment.Member.LoginName) -ne "")
                          {
                              "item,direct," + (stripProvider($RoleAssignment.Member.LoginName)) + "," + (UserSource($RoleAssignment.Member.LoginName)) + "," + $Web.Url + "/" + $anItem.URL + "," + ($RoleAssignment.RoleDefinitionBindings | select name).name
                          }
                          else
                          {
                              $allUsers = $Roleassignment.member.users
                         
                              #Perform some action against all members returned.
                              foreach($User in $AllUsers)
                              {
                                  "item,role," + (stripProvider($User.LoginName)) + "," + (UserSource($user.LoginName)) + "," + $Web.Url + "/" + $anItem.URL + "," + $RoleAssignment.member.name
                              }
                          }
                      }
                  }
              }
          }
          $web.Dispose()  # release each SPWeb as we go - AllWebs can chew up memory on large site collections
       }
       $site.Dispose()
    }
  }
}
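
To use it, I just run the script on one of the SharePoint servers (under an account with rights to read the whole farm) and redirect the output to a file. The script name and paths below are placeholders, so adjust to taste:

.\Enumerate-SPPermissions.ps1 | Out-File -Encoding UTF8 C:\Temp\SPPermissions.csv

# PowerShell 2.0's Import-Csv can then pull it back in for those automated tasks:
$permissions = Import-Csv C:\Temp\SPPermissions.csv -Header Scope,Type,User,Source,Url,Role
$permissions | Where-Object { $_.Type -eq "direct" }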

Wednesday, May 16, 2012

SharePoint on mobile: HTML vs Generic App vs Custom App

Here at work we've been working with SharePoint since mid 2006 (SP2007 beta) and most of what we do is based around SharePoint 2010 these days. So with the influx of smartphones and tablets (OK, to be honest, it's largely iPhones and iPads, with a very small sprinkle of Android and WM7 devices) we've been looking at what our corporate response is and how to combat the likes of DropBox.


Given that we already use SharePoint for our Intranet, Internet and Extranet sites, it seems fairly logical to find a way of leveraging it to deliver content, securely, to mobile devices.  Currently there are three main approaches to supporting mobile devices:
  • The plain HTML/browser experience (SharePoint in Safari or whatever browser the device ships with)
  • A generic SharePoint app (e.g. Filament Lite)
  • A custom app built specifically for your content

I'm not going to say which approach is best for you, as every situation is different. But I will say you need to look closely at what you're trying to achieve and how you expect your users to interact with your content.

For example, here's our WWW site in Safari on iOS 5.1.1 on my iPhone 4s:
It's nicely branded, looks basically the same as it does on a desktop browser.  The content is largely HTML based with the odd PDF.  It's also basically a read-only site as we use SharePoint's content deployment to push the content from an internal staging farm to the DMZ based external facing farm.

Now here's the same thing in Filament Lite:
OK, so we're now looking at the "raw" SharePoint content, with Sites and Lists. But no branding, no whizzy animations and mouse-overs, no neat menu structure and no whole-site search (Filament Lite only supports searching within a list). Not such a great story for consuming our WWW site then.

On the other hand, what about something more collaborative?  Here's part of one of our Extranet sites in Safari, again on my iPhone 4s:
So now we've got a more vanilla branding and the site is far more out-of-the-box SharePoint.  We've got a bunch of document libraries, each with various Word, PDF, Excel, PowerPoint and Project files. Safari is more than happy to navigate the site and open/view most of the documents, but it's pretty much a read-only experience (you could do something around download/upload, but that's quite cumbersome). Lists, however, have full read-write capability thanks to being simple HTML forms with fairly standard bits of JavaScript and CSS.

OK, how about Filament Lite:
Yup, we've got our document libraries and lists.  It's more than happy to drill into these:
Unfortunately, the Lite version doesn't support editing list items or documents, but I know the full/paid versions of the SP apps listed often support editing, and I've seen editing on the iPad equivalents.

So what about Custom Apps?  A custom app has the potential to provide your branding and target the exact functionality your users need, but at what cost? And how many platforms do you need to support?

So that's my 2c on the matter currently, no real answers, just more questions :)
We're still looking for a solution that fits our needs and a process that is simple enough for our staff to follow...

I look forward to seeing what the App vendors come up with and what MS do with the next version of SharePoint...

Friday, June 03, 2011

And now for something completely different

Spied KiwiRail's new loco 9135 being dropped on the rails for the first time.  It's HUGE!

Monday, April 04, 2011

Microsoft Releases AD FS 2.0 Capacity Planning Spreadsheet - hilarity ensues...

OK, so it's been a while since I last blogged, various excuses are the reason...

But today I was tracking new downloads from Microsoft; we use their products a lot here at work, and that can be both good and bad.

I'm involved in a project that will be using AD FS 2.0 and SharePoint 2010, so when I saw that MS had released a Capacity Planning Spreadsheet for AD FS 2.0, I ran off and downloaded it. Plugged in our numbers (we have about 450 staff and over 2000 Extranet users) and this is what I saw:
That's right, 0.01 AD FS servers are recommended (actually, the first time I ran it, I got 0.00!).
Now, we're not a small firm by NZ standards, but we're certainly not an "Enterprise" by USA standards, either.

So why does it come up with such a small number?  Flip to the second sheet of the spreadsheet and you get MS's server sizes, as tested:
  1. Federation Server
    • Dual Quad Core 2.27GHz (8 cores)
    • 16GB RAM
    • Windows Server 2008 R2, Enterprise Edition
    • Gigabit Network
  2. Federation Proxy Server
    • Quad Core 2.24GHz (4 cores)
    • 4GB RAM
    • Windows Server 2008 R2, Enterprise Edition
    • Gigabit Network
[ouch!]

Which they qualify with:
"Capacity recommendations for AD FS 2.0 servers can vary considerably, depending on the specifications you choose for the hardware and network configuration used in a given environment. As a point of reference, the sizing guidance provided in this content is based on a utilization target of 80% on the computers specified above.



** Memory and disk space requirements for federation servers are modest, and they are not likely to be a driving factor in hardware decisions. The estimates contained in the AD FS capacity planning sizing spreadsheet can be used to estimate the recommended number of federation servers with more moderate memory specifications, such as 4 GB."



Well, thank goodness for that! My original calculations had their baseline hardware at 70,000 times more powerful than was necessary to support us!
 
Let it not be said that MS doesn't support/promote hardware vendors :)
 
All joking aside, a more useful (if somewhat more complex) calculator would have allowed for variances in hardware (or VM) specs, to cater for those of us who don't run 40,000 users through AD FS against 6 claims apps. Yes, that's what it takes to require ONE AD FS server...

Friday, June 04, 2010

SharePoint 2010 RTM - Least Privilege Install - My Experience

  • WARNING [11/11/2010]: It looks like Google's update to the template and rich text editor has mangled my post, so watch out for instructions that don't make sense or are out of order.... I need to replace this whole article now

  • Update [5/10/2010]: I've actually switched to a script based install, which appears to get around some of the problems that I'm seeing (I think User Profile Syncing is working, nearly!), but there's still a lot of manual UI config to be done and some service applications can't have their DB specified in PowerShell.  Look for a new article detailing this soon!
  • Update [5/10/2010]: Added new bullet point for enabling the Diagnostic Data Provider: Performance Counters - Database Servers timer job to connect to the SQL server when using SQL Aliases.
  • Update [9/07/2010]: Added new bullet point for enabling the Managed Metadata feature for sites.
  • Update [30/06/2010]: Added new bullet point for Developer (Visual Studio 2010) access.
  • Update [16/06/2010]: Added fix for browser language/locale issue in phonetic search. Applies to SharePoint Search and FAST.



Preamble:



OK, so we've been using SharePoint 2007 here at work since it came out (actually we started while it was still in Beta) and we've been chomping at the bit for SharePoint 2010 and the promises it's made. Some we've seen ourselves, as I installed first the Beta2 and then the RC, with somewhat mixed results. (Basically next, next, next, finish installs, as that's all the advice that was available at the time.)



Now things have changed a little, the RTM is out and next, next, next just doesn't cut it. It's fine for quickly throwing up a primitive VM, but when you're trying to learn the product and install in a manner that will actually teach you what's going to happen when you try to install into your Production environment, then it's the hard way, all the way.



What follows is the guts of my install process, some of the gotchas I hit along the way and how I resolved them. It also skips various bits, some because I'm not going to tell you how to press Next and others because I just haven't installed those parts, yet.



It's by no means perfect, I have a small cluster of errors turning up in my logs, some I know what they're from but haven't found a solution, others are completely unknown.



So please use the comments to send me feedback, particularly on what I've completely munged...



Oh, and you won't find a boatload of screenshots or lists of commands. If you need screenshots to install something, either it's too complex, or you shouldn't be installing it... :) And long lists of commands don't really offer anything useful other than polluting search results...



Eventually I'll extend this with bits on FAST (which I think I've managed to install with virtually no pain) and playing with the BCS (as this is going to be important for us in the future).



So here goes...



  1. Pre-installs
    OK, I'm assuming you've got something like Windows Server 2008R2 x64 based VMs. I ended up with three:
    • SPS01 - 1 CPU, 4Gig RAM, 40Gig C:
    • SQL01 - 1 CPU, 4Gig RAM, 40Gig C: (sys), 30Gig E: (data), 20Gig F: (logs)
    • FAST01 - 2 CPU, 4Gig RAM, 50Gig C:


      Install something like SQL2008R2 x64 on the SQL box and configure it up (engine, AS, RS, Agent & FullText).
      Install the SQL2008R2 x64 client on the SharePoint and Fast boxes.
       Bonus SQL step:
      On the SQL server, give the local SQLServerMSSQLUser... group Local Launch permission on MsDtsServer100 in Component Services (DCOM) to allow error-free overnight maintenance scripts.
  2. Extra Bonus SQL step:
    If you use SQL Aliases (and currently I can't see the point, despite it being "best practice", DNS aliases/CNAMEs seem to make far more sense), ensure you have a DNS alias/CNAME that does the same mapping, to allow the Diagnostic Data Provider: Performance Counters - Database Servers timer job to establish an RPC connection to the SQL server.



    Microsoft have a couple of other pointers HERE
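
    If you do go the DNS route, the CNAME is a one-liner on your DNS server (the server, zone and names below are examples only):

      dnscmd dns01 /RecordAdd example.local sqlalias CNAME sql01.example.local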
  3. Follow THESE instructions to set up a SharePoint AD Container (optional, it's for tracking SP2010 installs on your Domain/Forest)
  4. Do a full install from the SP2010 media. We're licensed for Enterprise edition and this is the only time I did pretty much a "full" (Server Farm/Complete) next, next... type install. Use something like THESE instructions to guide you.
    I installed using my personal Admin level account (domain admin), a SQLService account (used to run the SQL processes on the SQL box, standard SQL type stuff) and a Farm Account/SQL Access account, which at this point is my "Swiss Army" account, as if you stop using it, things start to break... The Farm Account has special privs in SQL and is a local admin on the SharePoint box. The latter is only for the duration of the install/setup (you'll see why later when you hit the User Profile Service) and isn't mentioned in those instructions.
  5. Run the SharePoint Products Configuration Wizard (or just let it run at the end of the install)
  6. OK, now this is where Next, Next, Next... stops. Feel free to join the User Experience Program (it's up to you), but I'm trying not to use the Farm Configuration Wizard...
  7. Now is a good time to open up some ports on the Windows Firewall on your SP box, as your Central Admin port is unlikely to be available other than locally (you might as well ensure any other ports you intend to use are open too)
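
    For example, something like this in an Administrator CMD console (the rule name and port are just examples; use whatever port you gave Central Admin):

      netsh advfirewall firewall add rule name="SharePoint Central Admin" dir=in action=allow protocol=TCP localport=2010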
  8. Now is a good time to set up some more DCOM permissions (reminds me of SP2007....)
    • The Farm Account will need Local Activation permissions on both IIS WAMREG and 000C101C-0000-0000-C000-000000000046 (MSIServer)
      Remember, that's back in Component Services DCOM
  9. Now I set up a beginning set of Service Applications.
    I've recently seen suggestions that the "Services on Server" services should be started before the Service Applications are created/configured, but I saw this afterwards...

    First up is the Secure Store Service:
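
    If you'd rather script this one instead of clicking through Central Admin, something along these lines ought to do it (untested here; all the names are placeholders and the account needs to be a registered managed account first):

      $pool = New-SPServiceApplicationPool -Name "[ssappoolname]" -Account "[domain\serviceaccount]"
      $sss = New-SPSecureStoreServiceApplication -Name "[servicename]" -ApplicationPool $pool -DatabaseName "[securestoredbname]" -DatabaseServer "[dbserveralias]" -AuditingEnabled
      $sssp = New-SPSecureStoreServiceApplicationProxy -Name "[servicename] Proxy" -ServiceApplication $sss -DefaultProxyGroup
      Update-SPSecureStoreMasterKey -ServiceApplicationProxy $sssp -Passphrase "[passphrase]"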
  10. Next up is the Usage and Health Collection Service Application
    • open an Administrative SharePoint PowerShell and execute something like:

      New-SPUsageApplication -Name "[servicename]" -DatabaseName [usagedbname] -DatabaseServer [dbserveralias]

      This creates the service and its proxy, but for me it leaves the proxy stopped and I haven't investigated far enough to figure out why.
  11. Next up is the State Service Application
    • in the PowerShell you left open from step 10 (you didn't close it right?) execute something like the following:

      New-SPStateServiceDatabase -Name [statedbname] -DatabaseServer [dbserveralias]

      New-SPStateServiceApplication -Name "[servicename]" -Database [samestatedbname]

      New-SPStateServiceApplicationProxy -ServiceApplication "[sameservicename]" -Name "[sameservicename] Proxy" -DefaultProxyGroup


      This creates the State Service DB, assigns it to a new State Service and then assigns the State Service to a new State Service Proxy.
  12. Next up is the Session State Service
    • again in your PowerShell console:

      Enable-SPSessionStateService -DatabaseName [sessiondbname] -DatabaseServer [dbserveralias]
      This creates the Session DB and assigns it to a new Session State Service Application, which is automatically called "SharePoint Server ASP.NET Session State Service"
  13. Now for the Web Analytics Service Application
  14. Time for the Managed Metadata Service
    Note: my MMS seems to have a bit of a bumpy life, often reporting as not available, but we've been having disk issues in our SAN which _might_ be the cause.
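
    If you'd rather create it from the PowerShell console you've already got open, something like this should do (names are placeholders and I haven't scripted this one myself, so treat it as a sketch):

      $mms = New-SPMetadataServiceApplication -Name "[servicename]" -ApplicationPool "[appoolname]" -DatabaseName "[mmsdbname]" -DatabaseServer "[dbserveralias]"
      New-SPMetadataServiceApplicationProxy -Name "[servicename] Proxy" -ServiceApplication $mms -DefaultProxyGroup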
  15. Time for the Search Service Application:
  16. Might as well configure up the SharePoint Foundation Help Search Service:
  17. About now the SP server event logs will start complaining about no Cache Super User account for the Central admin, so run:
    stsadm -o setproperty -propertyname portalsuperuseraccount -propertyvalue [lowprivdomainaccount] -url [urlofsite]
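
    Or, the PowerShell equivalent (for a claims-based Web App the account needs to be in the claims-encoded form):

      $wa = Get-SPWebApplication [urlofsite]
      $wa.Properties["portalsuperuseraccount"] = "[lowprivdomainaccount]"
      $wa.Update()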
  18. Fix the TaxonomyPicker Control issue:
  19. Start the Application Registry Service and Claims to Windows Token Service:
  20. Start the Microsoft SharePoint Foundation Subscription Settings Service and the Microsoft SharePoint Foundation Sandboxed Code Service
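
    The services in steps 19 and 20 can be started from the Services on Server page, or from PowerShell with something like this (check the TypeName strings against what Get-SPServiceInstance reports on your farm):

      Get-SPServiceInstance | Where-Object { $_.TypeName -eq "Claims to Windows Token Service" } | Start-SPServiceInstance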
  21. At about this point I create my two primary Web Apps:
  22. Create local profiles for App Pool accounts:
    • IIS7/Win2008R2 seems to like App Pool accounts to have local profiles (otherwise it assigns a temp one each time an App Pool starts), so you can create one using the instructions HERE
  23. The Site Directory feature is turned off in SP2010 by default (to avoid migration/upgrade conflicts). I'm not attempting to migrate right now, so I've turned it on using THESE instructions.
    • Make sure the Publishing Feature is turned on at the Site Collection level and probably about time to check that the Standard SharePoint Features are turned on too
  24. OK, here's where the real fun begins (or at least the most issues...): yes, it's time to set up the User Profile Service. But first, configure and start the Document Conversion Load Balancer service on a memorable port (I've pretty much used a sequential set of ports for services I have control over)
    • Ensure MSIServer has Local Activation for Network Service in Component Services (may need to assume ownership and add full control to regkey HKCR/AppID/...)
    • Ensure MSDTC has network incoming permissions (in Component Services)
    • Follow the instructions HERE as they are awesome and work!
    • If your NETBIOS AD Domain name is different from your FQDN, then use that bit of the instructions HERE. Here at CT we need to do this.
    • I'm still getting what look like UPS related errors:
      • Product: Microsoft User Profiles -- Error 1706. An installation package for the product Microsoft User Profiles cannot be found. Try the installation again using a valid copy of the installation package 'pplwfe.msi'.
        My attempt to fix this consists of finding pplwfe.msi on the SharePoint2010 install media and running it manually. Now I have to wait and see if that has fixed it.
        Note: After a reboot I started getting a "trial expired" error on all Web Apps except the Central Admin so had to re-run the Products Configuration Wizard.
      • Detection of product '{90140000-104C-0000-1000-0000000FF1CE}', feature 'PeopleILM' failed during request for component '{1681AE41-ADA8-4B70-BC11-98A5A4EDD046}'and
        Detection of product '{90140000-104C-0000-1000-0000000FF1CE}', feature 'PeopleILM', component '{1C12B6E6-898C-4D58-9774-AAAFBDFE273C}' failed. The resource 'C:\Program Files\Microsoft Office Servers\14.0\Service\Microsoft.ResourceManagement.Service.exe' does not exist.
        I'm guessing that these two are related. So far, general consensus on the net is that a "proper" install of UPS makes these go away. However, aside from using the Wizard, I've not managed to make them go away...
  25. Configure and start the Document Conversion Launcher Service and point it at the Load Balancer configured in the previous step.
  26. At this point the Health Analyzer will be screaming at you to stop using Local Service accounts and to stop using the Farm Account.
    I still need to work my way through these, ensuring that things don't break when I change the account. I think I'll use a single domain based "Services" account to run most of it, which has no privs, other than what SharePoint assigns it.
  27. Enable the BlobCache as per MS's Instructions for each WebApp (aside: looks like MS have rearchitected the BLOBCache for 2010)
  28. Now for some Optional Bits:
    Install FAST:
    • We're licensed to use FAST and quite keen to extend it into some of our LOB systems.
    • THESE instructions from MS are quite good (and don't forget to install WebApps on the SP server first!)
    • But expect to hit THIS problem, so use a completely unheard of account format ([domainfqdn]\[fastserviceaccount], e.g. microsoft.com\FASTAccount) Note: you'll also hit this issue when configuring the Click-Through-Relevancy settings.

    I'm currently seeing one FAST related error:
    • On the SharePoint side from Schannel [36888] Error: Error: The following fatal alert was generated: 10. The internal error state is 10.
    • On the FAST side from Schannel [36885] Warning: When asking for client authentication, this server sends a list of trusted certificate authorities to the client. The client uses this list to choose a client certificate that is trusted by the server. Currently, this server trusts so many certificate authorities that the list has grown too long. This list has thus been truncated. The administrator of this machine should review the certificate authorities trusted for client authentication and remove those that do not really need to be trusted.

    Though this seems minor and unavoidable in this day and age (given the number of Root CAs installed on Windows by default)
  29. Set up PDF indexing for normal SharePoint Search (if you're not using FAST, or in my case, I'm using both, for direct comparison) using THESE instructions
  30. If you want to add Managed Metadata Columns to your sites, don't forget to MANUALLY enable the feature:

    STSADM -o activatefeature -id 73EF14B1-13A9-416b-A9B5-ECECA2B0604C -url http://<server> -force
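
    Or the PowerShell equivalent:

    Enable-SPFeature -Identity 73EF14B1-13A9-416b-A9B5-ECECA2B0604C -Url http://<server> -Force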
  31. Developer Access:

    Visual Studio 2010 users will need to:
    1. Run VS2010 as Administrator and
    2. have dbo level access to the Web App Content DBs
    3. have dbo level access to the Farm Config DB
    • Main Content (don't forget to do the Cache Super User account thing as per step 17)
      • Generally I create blank publishing site collection for the root
    • MySites (as with Main Content, do the Cache Super User thing)
      • I'm currently using a separate port (http://sphost:81/) to pick out my MySites, as this is where we ended up with SP2007. Before going production, I'd like to change this to use host headers and a separate DNS name.
      • Don't forget to create a "My Sites Host" site collection
    • In the past I've used separate App Pools for my Main Content and MySite Web Apps, but general advice out there at the moment seem to be to use the same one. I suspect I'll keep them separate in production.
    • Yes, I'm using a low priv Domain Account for the App Pool
    • I suspect I need to use the wizard (or a command line) to create the Application Registry Service correctly, but haven't gone down that path yet
    • May hit THIS or THIS issue
    • Don't change the service account (for Claims to Windows Token Service) from Local System, as this will cause more problems. However, this means that the Health Reports will report a local system account in use... [false positive]
    • According to THIS website you need to fix an HTML/XML encoding error in the file
    • Note: replace & #44; not &#44 (remember the semicolon)
    • I found this didn't work (the error still turns up after an App Pool recycle or IISRESET), so I ended up renaming the TaxonomyPicker.ascx file as apparently it's not used anymore.
    • Note: running PSConfig seems to re-create this file...
    • Pretty much default config. I'm currently using the Farm Account for the App Pool and a low priv account for crawling/content access.
    • Start the service once you've configured it.
    • Ensure the crawling account has permission to crawl MySites as per THIS article
    • Note: We hit THIS problem, which is where the SharePoint Core Search Results (and same for FAST) don't recognise the en-NZ locale from the browser (yes, we're in New Zealand, yes we speak English), as opposed to en-US. So you need to go into the search results page (for SharePoint search and/or FAST search) and force the language to English. I'm guessing that the American programmers forgot that en, en-UK, en-AU, en-NZ, en-US, etc are more or less the same (ahh, though there are spelling differences of course and I bet that the English setting is en-US!). This is all well and good, but the People Matches webpart doesn't have the same language setting, so you have to go to the People scope to get phonetic/spell-checked people results... [sigh]
    • I've found that I need to give the Farm Account full control of regkey:
      HKLM\SYSTEM\CurrentControlSet\services\VSS\Diag
      to prevent some errors turning up in the Event Log.
    • you may need to navigate to the searchadministration page to ensure that some controls get registered properly, as per THIS and THIS article
    • again, use the Farm Account for the App Pool
    • and don't forget to start the Managed Metadata Web Service on the Services on Server page
    • IISRESET
    • ensure the Farm Account has sufficient permissions to the MMS as per THESE instructions, though they should only apply if you DON'T use the Farm Account... but I'm not entirely sure...
    • it'll pay to navigate to the MMS to ensure that it's running OK, perhaps create a Term Store and some Terms while you're there...
    • Give it a nice name
    • create a new app pool based on the Farm Account
    • give the Staging and Reporting DBs nice names
    • read the dialog at the end and follow it closely:
      • to start the two Web Analytics services in "Services on Server"
      • configure and enable the usage schedules (optionally including DB performance) from the Monitoring Review job definitions page
      • add the Farm Account to the Performance Monitor Users local group on the SQL server
    • IISRESET
    • create a new Secure Store Service Application on the Central Admin Application Management Manage Service Applications page
    • time to point out that the Nav in Central Admin really bites...
    • give it a nice name and give the DB a nice name too
    • use the Farm Account for the App Pool
    • start the "Secure Store Service" on the Services on Server page.
    • Do an IISRESET in an Administrator CMD console
    • Don't forget to create a new Key from the new Secure Store Service Application on the Manage Service Applications page
    • Create a New Farm
    • Use a SQL Server Alias (became a best practice for SP2007 too late for me to implement, so now I'm making sure I use them for SP2010. Don't like the way it's all done with Registry Keys though...)
    • Use a good naming scheme for naming your DBs. Unless you live in PowerScript land, you won't be able to choose the names of all your DBs, but the least you can do is logically name the ones you do have control over.
    • I use a consistent port for all our Central Admins and tend to stick with NTLM for them too (we have Kerberos working with our Intranet, primarily to support SSRS)
And that's it for now. That's as far as I've gotten. My main goal at the moment is to hook FAST up to content and get that running slickly. Secondary goals are to get rid of any errors turning up in the event logs and to set up the remaining services (Excel, PerformancePoint, PowerPoint, Access, etc)
If you use these instructions and hit any problems, please do let me know, or if you've seen any of the same problems as me but resolved them, please drop me a comment.
Later'ish
Craig

Wednesday, May 12, 2010

Informal Telecom HSPA+ performance test

I recently received an HSPA+ USB stick from Telecom/Sierra Wireless, without a SIM/plan. So, being the geek I am, I pulled the SIM from my HTC Touch Pro 2 and dropped it in the USB stick.

I then proceeded to do the very unscientific test below using SpeedTest.net, comparing it with my aging Telecom CDMA Rev A card and the built-in WiFi (802.11g) on my laptop via my Telecom/Xtra BigTime plan, during what is probably peak time.

Some interesting things to note:
  • The HSPA+ (XT) network and ADSL2+ (Xtra) both appear to have POPs in Auckland (SpeedTest.net picks Auckland as the nearest/default server), but the CDMA network seems to have its POP in Wellington
  • Ping times were all over the place and probably fairly reflective of the fact that this was done during peak hours
  • For some reason (and I've seen this before) my ADSL gets great ping-times to Wellington (thanks CityLink!) and poor download speed from Auckland
  • My ADSL2+ is connecting at 11.5/0.94Mbit/s
    Yes, I'm lucky enough to be in an ADSL2+ area (you pay the same price even if your connection is slower)
  • Both the HSPA+ and CDMA were showing full signal strength
  • While HSPA+ didn't get anywhere near its 21Mbit/s theoretical limit (and there can't be that many people in my cell who are using it yet, right?), it does knock the socks off CDMA Rev A (pity I don't have a plain HSPA device to compare with)
  • Cost aside, HSPA+ looks like a viable alternative to current average ADSL speeds
  • Pricing (no discounts included)
    • Mobile Broadband (HSPA+) - $59.95/month for 2Gig plan or $29.95 for 500Meg prepaid
    • Xtra BigTime Broadband (ADSL2+) - $59.95/month for unlimited (throttled) plan
    • Mobile Broadband (CDMA Rev A) - unknown (it's on a corporate plan)
  • It would appear that I'm not even getting close to HSPA+'s potential (I thought it might be because it was still locked down to HSPA speeds), but the guys on GeekZone.co.nz have been reporting speeds between 16 and 21Mbit/s!


Excuse the gap; it's something that Blogger.com seems to do to HTML tables. I really should learn more CSS...

(Speed test results table: Remuera, Auckland to the Auckland, Tauranga, Napier, Wellington and Sydney SpeedTest.net servers, comparing the Sierra Wireless HSPA+ 308 USB stick, the Telecom CDMA Rev A PCMCIA card and Telecom ADSL2+ (BigTime & WiFi G).)