Friday, December 30, 2011

Lync User Adoption And Training Kit

The team at Microsoft has made a great resource available for companies planning to roll out Lync within their organization. http://lync.microsoft.com/adoption-and-training-kit

Here you will find ready-to-use training material for end users, a pre-made help site which you can make available on your intranet, a great overview of who needs what level of training, and lots of useful resources and tools. Creating a Lync adoption plan for your organization has now become a matter of hours rather than days or weeks.

Wednesday, December 7, 2011

Trend Analysis of Lync Servers using Reporting Services Snapshots

So you have a Lync farm and have monitoring set up. You keep the last six weeks' worth of data for those reports, but what about last year? How can you compare the usage and network statistics that the out-of-the-box Lync reports give you effectively and efficiently?
SQL Server Reporting Services snapshots could be the answer. To enable snapshots and start tracking usage over time you need to do the following:
1) un-hide the individual reports in Report Manager
2) create a snapshot schedule
3) subscribe to the snapshots so you don't forget to look at the data!


By default only the homepage and the dashboard can be viewed from the Report Manager URL (http://your-report-server/Reports). At the root of that URL you will find a folder called "Lync Server Reports". Within that folder is the Reports Home Page; this is the main dashboard that opens directly when you open the reports from the Lync Control Panel. At the same location you will find a sub-folder called "Reports Content". This is where all the juicy sub-reports are hidden. Yes. HIDDEN. But you can still see them by clicking the Show Details button found at the top right of Report Manager. From there you can unhide reports, set up snapshot schedules and even subscribe to the snapshots to easily stay informed about your general Lync farm health.
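If you prefer to script these checks, here is a minimal sketch that lists the hidden sub-reports through the SSRS ReportService2010 web service. The report server URL and the folder path are assumptions based on a default "Lync Server Reports" deployment; adjust both to your environment.

$uri = "http://your-report-server/ReportServer/ReportService2010.asmx?wsdl"
$rs = New-WebServiceProxy -Uri $uri -UseDefaultCredential
# list everything in the Reports Content folder, hidden items included
$items = $rs.ListChildren("/Lync Server Reports/Reports Content", $false)
$items | Select-Object Name, TypeName, Hidden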



First go to the Report Manager URL, which is the FQDN of your report server followed by /Reports.




Then drill down to the Reports Content page with the Show Details button at the top right. Just two data sources and the Monitoring Dashboard are visible.


Click on the edit properties button of the report to change the visibility status.



That's the magic tick box to make the reports easily visible in the report manager again.



Click the History link of the Report to set up a snapshot schedule



From here you can run a snapshot at a regular interval



And finally add a subscription based on the snapshot so you get an email and can look at performance-related data on a regular basis! (If email is not shown as an option, check the SMTP settings of the Reporting Services server.)


A couple of tips along the way:
  • Set up different schedules for different types of reports
  • Set up Subscriptions only for reports that you are planning on reading
  • Make sure your Reporting Services server is properly configured and has a valid SMTP server address associated with it (Reporting Services Configuration Manager on the report server). Otherwise the subscriptions page will not show email as an option!


Tuesday, September 20, 2011

Hands on Labs for SharePoint and Lync

[Update 7/12/2011: Found two more Lync Labs and added them to the list]

Often when I teach a course, students feel overwhelmed by the sheer amount of information bombarding them in just five days. Obviously the best medicine is to go over the books once they have had a chance to breathe and digest all the great new things they have learnt. But the best way to learn is not to read about it but to do it! No, I can't give my students copies of the lab servers to play with at home. And no, you should not repeat the exercises on your production SharePoint farms. Luckily there is a simple answer to this dilemma: Virtual Labs.
The boys and girls at Microsoft have put a tremendous amount of effort and resources into creating lab environments which anybody can use to gain invaluable skills with the SharePoint products. The labs are split into two main audience groups: administrators and developers. While the administrators get their juice from TechNet, the developers can get their labs from MSDN. Below is a compilation of labs which I think you will find quite useful. (Lync labs are at the end of the article.)


TechNet Virtual Labs for SharePoint 2010 Administrators

  • TechNet Virtual Lab: Backup and Restore in SharePoint Server 2010 - After completing this lab, you will be better able to use farm-level backup and restore features in Microsoft® SharePoint® Server 2010, use granular backup and content recovery tools in SharePoint Server 2010, and use Microsoft Windows® PowerShellTM to restore sites and lists in SharePoint Server 2010
  • TechNet Virtual Lab: Business Connectivity Services - After completing this lab, you will be better able to create an External Content Type based on a back end database and view and manipulate backend data in External Lists
  • TechNet Virtual Lab: Business Continuity Management in SharePoint Server 2010 - After completing this lab, you will be better able to configure and use the versioning features in Microsoft® SharePoint® Server 2010, configure and use the SharePoint Server 2010 Recycle Bins, and use Microsoft Windows® PowerShellTM cmdlets to perform backup and restore operations in the SharePoint 2010 Management Shell
  • TechNet Virtual Lab: Configuring Remote Blob Storage FILESTREAM Provider for SharePoint Server 2010 - After completing this lab, you will be better able to enable FILESTREAM on the appropriate SQL Server database, provision the RBS data store, install the Remote Blob Storage (RBS) FILESTREAM Provider, enable the RBS FILESTREAM Provider on the appropriate content database, and configure the RBS FILESTREAM Provider scenario
  • TechNet Virtual Lab: Configuring Tenant Administration on SharePoint Server 2010 - After completing this lab, you will be better able to create a new Tenant Administration site collection, manage site collections through Tenant Administration, and create a partitioned service application
  • TechNet Virtual Lab: Configuring User Profile Synchronization in SharePoint Server 2010 - After completing this lab, you will be better able to configure the User Profile Synchronization Service in Microsoft® SharePoint® Server 2010, start the User Profile Synchronization Service, create a New Profile Synchronization Connection, edit Profile Synchronization Connection Filters, map User Profile Properties, and configure Profile Synchronization Settings
  • TechNet Virtual Lab: Enterprise Search - After completing this lab, you will be better able to create a new content source for SharePoint to crawl, exclude certain results from being crawled by search, create a Search Center, and customize and extend the user interface
  • TechNet Virtual Lab: Introduction to Microsoft SharePoint Server 2010 Upgrade - After completing this lab, you will be better able to verify existing 2007 farm and content upgrade readiness through the use of the 2007 pre-upgrade checker command, verify existing 2010 farm readiness to upgrade specific content databases using the 2010 Test-SPContentDatabase cmdlet, initiate upgrade for individual content databases using the 2010 STSADM -o addcontentdb command, review the upgrade session status using the improved Central Administration web site Upgrade Status page, initiate upgrade for multiple individual content databases using multiple PowerShell sessions to trigger parallel upgrade sessions, troubleshoot an upgrade failure due to missing features, and how to restart upgrade for individual content database, and use Visual Upgrade features to switch sites from the 2007 product look and feel to the new 2010 product user interface
  • TechNet Virtual Lab: New IT Pro Features in SharePoint Server 2010 - After completing this lab, you will be better able to find your way around the Microsoft® SharePoint® Server 2010 Central Administration Web site, use basic site management tools in SharePoint Server 2010, describe the health monitoring and Web analytics capabilities of SharePoint Server 2010, provide and consume SharePoint Server 2010 service applications, and use commands from the Microsoft Windows® PowerShellTM command-line interface in the SharePoint 2010 Management Shell
  • TechNet Virtual Lab: Performance Management - After completing this lab, you will be better able to set limitations on the number of list items returned at a time, understand how large list limits affect users, configure Resource Throttling, and understand how Resource Throttling and HTTP Request Monitoring and Throttling affect SharePoint performance and user experience
  • TechNet Virtual Lab: PowerShell and SharePoint 2010 - After completing this lab, you will be better able to find your way around the SharePoint 2010 Management Shell and interact with SharePoint Web applications, site collections, and sites, use Windows PowerShell scripting techniques such as pipes, filters, wildcards, and enumerations for SharePoint Server 2010 administration. You will also be better able to explain how to create and assign variables and use the SharePoint object model from Windows PowerShell
  • TechNet Virtual Lab: SharePoint Designer for IT Pros - After completing this lab, you will be better able to create and modify lists on the SharePoint site, create and modify workflows, and save a SharePoint site as a reusable template
  • TechNet Virtual Lab: SharePoint RTM - IT PRO - Business Continuity Management - After completing this lab, you will be better able to navigate through an unattached SharePoint Content Database, export content from an unattached SharePoint Content Database, and import previously exported content
  • TechNet Virtual Lab: SharePoint RTM - IT PRO - Installing and Configuring - After completing this lab, you will be better able to successfully install and configure SharePoint Server 2010, create a Managed Account in Central Administration, create a Web Application in Central Administration, and also create a Site Collection within the Web Application
  • TechNet Virtual Lab: SharePoint RTM - IT PRO - Service Applications - After completing this lab, you will be better able to configure the new Managed Metadata Service Application, associate the Managed Metadata Service Application with a web application, manage the Metadata Service by adding your own custom groups and term sets, import a group into the Enterprise Term Store within the metadata service, utilize the Managed Metadata Service Application within a list, configure My Site settings, and create a My Site
  • TechNet Virtual Lab: SharePoint RTM - IT PRO - Upgrade - After completing this lab, you will be better able to verify existing 2007 farm and content upgrade readiness through the use of the 2007 pre-upgrade checker command, verify existing 2010 farm readiness to upgrade specific content databases using the 2010 Test-SPContentDatabase cmdlet, initiate upgrade for individual content databases using the 2010 STSADM -o addcontentdb command, review the upgrade session status using the improved Central Administration web site Upgrade Status page, initiate upgrade for multiple individual content databases using multiple PowerShell sessions to trigger parallel upgrade sessions, troubleshoot an upgrade failure due to missing features and know how to restart upgrade for individual content database, and use Visual Upgrade features to switch sites from the 2007 product look and feel to the new 2010 product user interface
  • TechNet Virtual Lab: Windows PowerShell in SharePoint Server 2010 - After completing this lab, you will be better able to find your way around the SharePoint 2010 Management Shell and interact with SharePoint Web applications, site collections, and sites, use Windows PowerShell scripting techniques, such as pipes, filters, wildcards, and enumerations, for SharePoint Server 2010 administration, and explain how to create and assign variables and use the SharePoint object model from Windows PowerShell

SharePoint Server 2010 Virtual Labs for Developers

  • MSDN Virtual Lab: Client Object Model - After completing this lab, you will be better able to retrieve lists, print a list, and use ADO.NET data services
  • MSDN Virtual Lab: Customizing MySites - In this lab you will work with some of the new events capabilities in SharePoint Server 2010 as well as the new Visual Studio 2010 SharePoint Tools. You will customize public my site by adding new public page, which can host any additional services exposed for my site end users, create stapled features to customize structures created in personal my site using Visual Studio 2010, and also create delegation control to customize top navigation and web parts, which exists by default in the personal my site
  • MSDN Virtual Lab: Designing Lists and Schemas - In this lab you will work with some of the new events capabilities in Windows SharePoint Services 14 as well as the new Visual Studio 2010 SharePoint Tools. You will create a custom list definition, template and instance using Visual Studio 2010, implement referential integrity between two SharePoint lists so that items in one list cannot be deleted until referenced items in a child list are removed first, and create a synchronous event receiver that is triggered when new task is created
  • MSDN Virtual Lab: Developing a BCS External Content Type with Visual Studio 2010 - After completing this lab, you will be better able to build a BCS External content type, create a Business Data Catalog Model project, configure the External Content Type for offline use, and open the list using Outlook
  • MSDN Virtual Lab: Developing a Sandboxed Solution with Web Parts - In this lab you will construct a basic Web Part that will call into the SharePoint API to retrieve some information. Next it will try and use SPSecurity to try to elevate privileges. The third and last action that is added is an attempt to initiate a HTTP connection to an external site
  • MSDN Virtual Lab: Developing a Visual Web Part in Visual Studio 2010 - After completing this lab, you will be better able to work with existing Web Parts and Linq and also you will be more familiar with connecting two web parts
  • MSDN Virtual Lab: Developing Business Intelligence Applications - After completing this lab, you will be better able to use the Chart Web Part to create graphical representations of data within SharePoint lists, use Microsoft Excel 2010 to examine data from SQL Server Analysis Services, and publish an Excel workbook with Excel Services to make it accessible to users using a browser. You will also be able to work with a PerformancePoint Services site and the new Dashboard Designer
  • MSDN Virtual Lab: Enterprise Content Management - In this lab you will work with some of the new capabilities added to SharePoint Server 2010 in the area of Enterprise Content Management. You will verify the configuration of Managed Metadata in a SharePoint Web application, customize the Managed Metadata term store and leverage it within an existing SharePoint site, and implement document sets
  • MSDN Virtual Lab: Getting Started with SharePoint 2010 - In this lab you will begin your work with SharePoint 2010 and become familiar with the Virtual Machine (VM) that you will be using. You will get experience working with the SharePoint 2010 Central Administration site as well as working with a standard team site. This will allow you to experience the new user interface concepts introduced in SharePoint 2010 such as the server-side ribbon and in-place item editing. You will also get a chance to write and test C# code using the SharePoint Foundation 2010 object model
  • MSDN Virtual Lab: LINQ to SharePoint 2010 - In this lab you will create lists for use with LINQ, and create a web part for accessing the list data using LINQ
  • MSDN Virtual Lab: SharePoint 2010 User Interface Advancements - After completing this lab, you will be better able to create and customize SharePoint 2010 lists for storing ideas for new toys, and use various new features of SharePoint, SharePoint Designer and InfoPath
  • MSDN Virtual Lab: Visual Studio SharePoint Tools - In the lab you will become familiar with the standard project structure used by SharePoint Tools, create and test a project that contains a Feature, a Feature Receiver and a Web Part, configure SharePoint Tools deployment options, and debug a SharePoint Tools project by single-stepping through the code in your solution
  • MSDN Virtual Lab: Workflow - After completing this lab, you will be better able to use Visio to create the high level process, export the Visio model to SharePoint Designer 2010, and use SharePoint Designer 2010 to complete the detail. You will also be able to export from SharePoint Designer 2010 to a WSP file and import the WSP file into Visual Studio 2010

Virtual Labs for Lync 2010 Developers


TechNet Virtual Labs for Lync 2010 Administrators


Monday, September 19, 2011

Using Office 365 in an Extranet Scenario

For me, this has been one of the most exciting aspects of Office 365: a fast and effective way of creating an extranet scenario without the need to worry about firewalls, external user accounts or certificates.
When you sign up with Office 365 on an Enterprise plan you get access to your SharePoint Admin control panel. (Sorry folks, the Small business version aka Professional Plan does not have this feature)
There you can create several site collections and even allow external users who have not been set up in your AD or in Office 365 to access your SharePoint sites.
This is great news for companies who work closely with clients and want to use the SharePoint Online part of Office 365 as an information-sharing hub.
When you open your first SharePoint team site at https://yourcompanyname.sharepoint.com as an administrator you will notice a new Site Actions menu item called "Share Site". This is where you can add users as a Visitor or Member of your site or sub-site. You will also notice that this does not seem to work for external users straight away. You first need to change two settings for external users to gain access.

  1. On the SharePoint Administration page accessible via https://portal.microsoftonline.com/admin/default.aspx you will find a button that says Settings. Use this button to enable external access for your whole account. 
  2. Then go to the Site Settings of your private site collection via the Site Actions menu and activate the External User Invitations feature under Site Collection Features.

That's it folks. From now on you can add any email address to the Share Site dialog and that person will receive an invitation email to connect to your SharePoint site.
Using what username and password you might ask? Windows Live ID solves this problem for us. This part of Office 365 integrates via federation with Windows Live ID and so the external user can either use their existing Windows Live ID or sign up for a new one.
Once the user has signed in with their Live ID, they will appear in the group that you added them to, and not any sooner. They will also appear with their Live ID email address, which might be different from the email address you sent the invitation to. So before you get a heart attack thinking you got hacked by some random person, double-check whether an unfamiliar email address belongs to someone you know who favours cryptic email addresses.

Happy Sharing!

Thursday, September 1, 2011

Call Admission Control (CAC) not working in Lync?

The best tool to test whether Call Admission Control (CAC) is actually being used once you have activated it is the Logging Tool provided with Lync. The protocol you are interested in is PDP. Another cool tool is the bandwidth monitoring tool provided in the Lync Server Resource Kit. That will show you the currently used bandwidth and also the existing limits imposed via bandwidth policies.
Should you be in a lab environment, there is a good chance you will not find any CAC entries in the PDP log file (look for the keywords: requested, current, returned) and the monitoring tool will show zeros throughout.

CAC is initiated from the client. Effectively, the client must know to start asking for bandwidth for anything to turn up in the logs and for CAC to do its magic. If the client does not know that CAC is enabled it will not ask. So make sure you exit the test clients completely and also run gpupdate /force on each test machine after you enable CAC.
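A quick way to double-check from the Lync Server Management Shell that CAC is switched on at all, and to list any bandwidth policy profiles (a sketch; cmdlet and property names as I recall them from Lync 2010):

Get-CsNetworkConfiguration | Select-Object EnableBandwidthPolicyCheck
Get-CsNetworkBandwidthPolicyProfile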

Another universal fix when Lync clients are misbehaving is to delete the client cache. For best results go to %userprofile%\AppData\Local\Microsoft\Communicator and delete every folder starting with sip_. That will clear out any cached values on your client (and the contact list too) and will force it to pull the latest updates. Careful! That will also delete any connection settings that you might have used for manual configuration instead of automatic configuration.
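Here is a small PowerShell sketch of that cleanup (close the Lync client first; the path is the same as above):

# remove every sip_* folder from the Communicator cache for the current user
Get-ChildItem "$env:USERPROFILE\AppData\Local\Microsoft\Communicator" |
    Where-Object { $_.PSIsContainer -and $_.Name -like "sip_*" } |
    Remove-Item -Recurse -Force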

Finally, try establishing an IM conversation first and then adding audio to that running session. That has proven most effective when clients refused to request bandwidth from CAC. After that, sessions that were not initiated via IM also request bandwidth.
If you still can't see anything happening in the monitor or any requests in the log files, I suggest you have a closer look at your Edge and Front End Servers. Make sure all services are running and restart the Lync bandwidth-related services.

Adding answers to IVR Response Group workflow via Powershell


Introduction

In Lync 2010 you can use the web interface to create interactive response group workflows with up to four possible answers per question and up to two levels of nested questions. Should you want to add a fifth answer to a question in the IVR workflow, you will have to resort to PowerShell. Although you can create workflows from scratch using PowerShell (a great article can be found here: http://blogs.technet.com/b/csps/archive/2010/09/15/rgscreateresponsegroup.aspx), you can also edit existing workflows instead of creating them completely new. In this post I will show you how to get hold of an existing workflow using PowerShell, add another possible answer to a question and then save the workflow to commit the changes.

Editing an existing Lync IVR Workflow using Powershell

You will need to get hold of the workflow objects so you can add new answers to them. First we retrieve the workflow by its name, then an existing queue by its name (this queue can have been created in the Control Panel), and finally the first question of the workflow, to which we plan to add a fifth option.

$Workflow = Get-CSRgsWorkflow -Name "MyWorkflowName"
$Queue = Get-CsRgsQueue -Name "MyQueue"
$Question = $Workflow.DefaultAction.Question


The next step is to create another action and answer and add them to the question's answer list. First we create an action which will transfer the call to a queue, specifying the previously retrieved queue. Then we create a fifth option for the question with DTMF tone 5 and the spoken version "Option5". Finally we add the answer to the question's answer list.
$Action5 = New-CsRgsCallAction -Action TransferToQueue -QueueID $Queue.Identity
$Answer5 = New-CsRgsAnswer -Action $Action5 -DtmfResponse 5 -VoiceResponseList "Option5"

$Question.AnswerList.Add($Answer5)


Finally you need to save the workflow back again using the Set-CsRgsWorkflow cmdlet:

Set-CsRgsWorkflow -Instance $Workflow
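If you want to double-check the result, here is a quick sketch (the property names are assumed to mirror the New-CsRgsAnswer parameters):

$check = Get-CsRgsWorkflow -Name "MyWorkflowName"
$check.DefaultAction.Question.AnswerList | Format-Table DtmfResponse, VoiceResponseList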

That's it! You've created and added a fifth possible answer to an IVR workflow which previously only had four possible answers. Please note that from now on you will not be able to edit the workflow using the web interface; you will get an "unsupported" error message instead.

Friday, August 26, 2011

Asterisk disk full

Just after my last class today I thought I'd copy the Asterisk box from the presenter machine to my backup drive. I was shocked to see a 44GB file! Obviously there was some logging problem bloating the hard disk, but it was not Asterisk's log files under /var/log/asterisk; instead it was core files found in /tmp.
These core files are effectively memory dump files created by CentOS when Asterisk misbehaves and crashes. I already had 43GB worth of dump files. The problem is that simple errors in your sip.conf can cause Asterisk to crash, restart and crash again, over and over until your HDD is full of dump files.
I found a handy post on how to control the core dumps here:
http://aplawrence.com/Linux/limit_core_files.html

But that only combats half the problem. The bigger question was why Asterisk was crashing so often. In my case, the answer lay in the qualify=yes entries in my sip.conf. If the Lync Mediation Server is not set up to respond to the gateway (topology not published yet, Mediation service not restarted) or is not online at all, then the qualify will fail. Worse, it would crash my Asterisk 1.6 box, causing those dumps to quickly fill up the HDD (50MB per dump). So setting qualify=no has helped me keep my Asterisk box more stable between my demonstrations.

Thursday, June 23, 2011

Using 2talk SIP Provider with Lync

Yesterday I posted about using Asterisk with Lync. Today I will share my configuration files for getting 2talk up and running with the latest version of Asterisk 1.6.
Effectively there are two configuration files you need to worry about: the sip and the extensions files.
The sip.conf file specifies the different SIP trunks and authentication mechanisms. The extensions.conf is your dialplan, which can be the trickiest to get working. Below are my versions (passwords removed).

sip.conf:

[general]
registerattempts=0
registertimeout=20
allowoverlap=no ; Disable overlap dialing support. (Default is yes)
udpbindaddr=0.0.0.0 ; IP address to bind UDP listen socket to (0.0.0.0 binds to all)
bindport=5060
bindaddr=0.0.0.0
tcpenable=yes ; Enable server for incoming TCP connections (default is no)
tcpbindaddr=0.0.0.0 ; IP address for TCP server to bind to (0.0.0.0 binds to all interfaces)
srvlookup=yes ; Enable DNS SRV lookups on outbound calls
notifyhold = yes

;register 2Talk number to receive incoming calls (replace placeholders with values, remove brackets)
register => <username>:<password>@2talk.co.nz/<extension>

[1001] ; A locally attached SIP extension (in my case an X-Lite client)
type=friend
callerid=1001
canreinvite=no
dtmfmode=rfc2833
mailbox=1001
disallow=all
allow=ulaw
transport=udp
secret=password
host=dynamic
context=default



[2talk]
type=friend
username=
fromuser=
secret=
host=2talk.co.nz
context=from-2talk ; going to use this in extensions.conf
dtmfmode=rfc2833
disallow=all
allow=ilbc
allow=gsm
allow=alaw
allow=ulaw
;allow=g729 ; only if you have licenses to use it
nat=yes
canreinvite=no
insecure=invite,port ; use insecure=very in earlier versions of Asterisk such as v1.2


[Lync_Trunk] ; Our Lync trunk
type=friend
port=5068 ; This is the default Lync Server TCP listening port
host= ; This should be the IP address of your Lync Mediation Server
dtmfmode=rfc2833
context=from-lync
qualify=yes
transport=tcp,udp

extensions.conf

[general]

static=yes
writeprotect=no

[globals]

[default]
;calls to a 4 digit extension starting with one are routed directly via SIP to local phones
exten => _1XXX,1,Dial(SIP/${EXTEN},20)
exten => _1XXX,2,hangup()

; outbound calls (outside of your own PBX) (only used for internal phones, not calls originating from Lync)
exten => _0.,1,Dial(SIP/${EXTEN:1}@2talk)
exten => _0.,2,hangup()

;calls coming in locally going to a 4 digit number starting with 2 are redirected over the Lync trunk
exten => _2XXX,1,Dial(SIP/Lync_Trunk/${EXTEN},20)
exten => _2XXX,2,hangup()

[from-lync]
;dialling other extensions starting with 1 followed by three digits are sent locally
exten=>_1XXX,1,Dial(SIP/${EXTEN},20)
exten=>_1XXX,n,hangup()

;send other calls to 2talk for Asterisk (no prefix here as Lync will probably have its own dialplan)
exten => _.,1,Dial(SIP/${EXTEN}@2talk)
exten => _.,2,hangup()


[from-2talk]
;send incoming calls on your 2talk number to a Lync Extension
exten => _X.,1,Dial(SIP/Lync_Trunk/2001)


That's it folks! Try using the two configurations with your 2talk SIP provider, add your own number, secret and Lync IP address and start making and receiving calls over Lync.

Tuesday, June 21, 2011

Using Asterisk with Lync 2010

Next week I am teaching how to install and configure Lync using the material provided in course 10533 from Microsoft. Although the course has some great stuff, it misses out on the really juicy bits, such as setting up a PSTN connection.
I simply could not resist setting it up for myself. After some digging I found this GREAT post by Adam:
http://imaucblog.com/archive/2010/10/09/step-by-step-microsoft-lync-2010-asterisk-and-skype-installationintegration-guide/

But then I needed to get my virtual machines connected to the internet! ICS has proven to be rather icky, and bridging really mucks things up. So in the end only a good ol' routing service would do the trick.
John Paul has a nice article on how to configure Hyper-V to route traffic to your wireless adapter:
http://sqlblog.com/blogs/john_paul_cook/archive/2008/03/23/using-wireless-with-hyper-v.aspx

WARNING! Skip the first quarter of the screenshots where he shows bridging. You don't want to do that. Jump down to where he starts talking about RRAS.

Once I had routing set up and could connect to the net using the private ip-ranges I wanted, I was ready to install Asterisk using Adam's instructions.

Some tips for getting it all running:
No need to buy the Skype plugin if you want to use SIP. I used 2talk and am happy with the results. For the detailed 2talk config try this page:
http://blog.2talk.co.nz/asterisk.html

I found that a mix of the settings from Adam and the ones from 2talk worked best.

WARNING! A typo or bad rule can send your Asterisk server into an endless crash loop. To fix that, copy the 2talk configurations back into the files and start over again.

Some helping commands I used along the way on the Linux box:
ifconfig (set a new IP address)
route (add a default route to connect to Gateway)
ping (test ip connectivity)
nslookup (test dns server connectivity)
ifconfig and route take effect immediately on the linux box. The changes made via the network setup program only come to life on the next reboot (which took about 5-10 minutes each time)
 
asterisk -r (connect to asterisk console)
  reload (reload all configs)
  dialplan reload (reload only the dialplan)
  sip show peers (see what routes are setup and working)

And as I hate using the command-line editor, download a file browser/editor for Windows ASAP, such as WinSCP: http://winscp.net/eng/download.php
You will need that editor to edit the /etc/asterisk/sip.conf and /etc/asterisk/extensions.conf files

Whenever you make a change to sip.conf or extensions.conf make sure to run the reload command in asterisk again.
Also, staying logged in to Asterisk will show what is going on and is useful when debugging conversation attempts.
Finally, try getting it working step by step.
1) connect X-Lite to SIP provider via Asterisk
2) connect X-Lite to Lync
3) connect Lync to X-Lite
4) connect Lync to SIP provider
Step 1 should be your priority. Once that works the rest falls into place.


Happy configuring. Having it all working is an exhilarating feeling. You'll know what I mean once you have made your first Lync phone call to a real landline.

Monday, June 13, 2011

Hierarchies inspired by Tube Maps

Probably one of the most famous and useful navigation resources is the London Underground map. It is truly a masterpiece of navigational aid and has hardly changed since its inception. When reviewing the different possible ways to map out a navigation structure I came across Patrick Walsh's post: http://www.boxesandarrows.com/view/a-map-based-approach

There he describes a process of using Visio to create a tube-map-like representation of an intranet. A sample could look like this:

Using interconnections, call-outs and differently coloured lines it now becomes easier to show the different paths a user could take through the system, how the navigation nodes relate to each other and where cross linking is used to help navigation.

It should also become easier to add many more links to a page while keeping it printable and easy to take in at a glance.

Donna Maurer (http://www.maadmob.net/donna/blog/archives/000639.html) has already found the benefits of adding actual data to a Visio diagram, which makes updating the diagrams with new titles or descriptions a breeze. As a data source you can use an Excel file, an Access database or even a SharePoint list!

A few tips for when you are working your way through connecting data to your shapes:
1) Name the field which contains the page's URL "Url". That way Visio will automatically hyperlink the whole shape. You can rename a field in Visio; there is no need to fix the source.
2) Create custom representations for your data using the Data Graphics button on the Visio 2010 ribbon.
3) Create four different data graphic options: one with the title above, one below and one for each side. Changing one option changes all instances of that option, which is why you need four separate ones.

I'll be handing out samples of the Visio file I used to create a map at my IA course in Wellington, which you can use straight away to build your own metro-style site maps!

Saturday, June 4, 2011

Introduction to SharePoint Information Architecture

It is my pleasure to announce the newest course offering by Equinox in Wellington:
http://www.equinox.co.nz/training/Pages/CourseDescriptions/sharepoint-information-architecture-introduction.aspx
I've teamed up with Equinox to deliver three SharePoint Information Architecture courses in Wellington. A one day introduction which covers the basics of IA as well as the basics of SharePoint. A two day course which then goes into the depths of creating taxonomies and hierarchies in SharePoint which can withstand the test of time. Finishing off with an advanced course which covers records management, governance, scalability and other advanced concepts over another two days.
I've been spending many hours online and in libraries prepping the contents of the introduction course, and while I will be demystifying some SharePoint myths during the course I am also looking forward to the interactive sessions with the students. IA simply is not something you can learn by listening to speeches or watching videos. It really only sinks in when you do it for real, so expect some fun and enlightening hands-on sessions to drive home the concepts learnt!

We'll be running the course on June 22nd. Places are limited; so make sure you book your place soon to avoid disappointment!
http://www.equinox.co.nz/training/Pages/CourseDescriptions/sharepoint-information-architecture-introduction.aspx

Expired Virtual Machines for 10232

Thank God, Microsoft Learning is fixing the date issue on the 10232 virtual machines. Looks like this time they didn't get away with "it is not broken, it is by design". So if you are planning on teaching Advanced SharePoint Development techniques any time soon, then make sure to get your hands on the newest images. They should be online within the next week or so.
Btw, 10232 is full of theory and "best practice" concepts. Great for solution architects, team leads and senior developers. Not really the stuff junior SharePoint developers are keen on. They just want to get their hands dirty coding WebParts.
Also, encourage the client to send their newbie SharePoint developers onto 10174 first before visiting any of the advanced courses. Although it is an admin focussed course it covers all the essentials which a developer needs to know about.

Thursday, April 21, 2011

MCT Tips and Tricks for 10232

Fellow MCTs,
I am posting this for the benefit of those MCTs who have not figured out that there is a whole wiki page at http://borntolearn.mslearn.net on this course once you log in with your MCT LiveID account.

10232 was built with a trial key for SharePoint that has a fixed expiry date. You must turn the clock back to August 2010 (tested using the 10th and 11th of August) to teach this course. Thanks go to Chris Barker for being the first to post a solution to this problem at the MCT forum. The key to getting the course to run is to set the clock of the host back before importing the VM and starting it up. This is because the host sets the BIOS date of the VM on its initial boot-up, so changing the time after the fact inside the VM does not help, as the SharePoint key has invalidated itself by then.
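For example, from an elevated PowerShell prompt on the host (a sketch using one of the dates mentioned above):

# set the host clock back before importing and first starting the VM
Set-Date -Date "2010-08-10"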
BUT...
When setting the clock back to a date before the host was built you might have trouble getting any VM to start; they will fail with a certificate error. This is due to a certificate Hyper-V uses when initialising the VM. To solve the problem you need to delete the self-signed cert and reboot the machine. That will force Hyper-V to recreate the cert at the older date and the VMs will start. To do this follow these steps:
1) Open MMC and add the Certificates snap-in
2) Select the service account option
3) Pick the Hyper-V Virtual Machine Management service
4) Drill down into the service's own certificates folder (the first folder)
5) Delete the self-signed cert with the machine name
6) Reboot

Last Tips for a smooth running course:
Configure the machine to use an internal network so you can remote into the box from the host. That will make copy and paste much easier and also allows you to put Fiddler on the machines. (My students loved seeing Fiddler in action and could not stop fiddling with it.)
Change the Integration Services settings for the VM so it does not sync the time. That will stop the host from updating the time in the VM. This does not work on the initial boot, as the time and date on the first start of the VM will always be set from the current host regardless of that setting, but it should prevent the machine from taking on a new date while it is running.
Then make a Snapshot!!!
Should a student have changed the time on the host during the course and the time synced due to some fluke, you can easily revert to the snapshot, which has the correct time settings and a functioning SharePoint instance.

Happy teaching!

Wednesday, April 20, 2011

SharePoint Diagnostics Studio V2.0

This will be one of the most anticipated admin and dev tools released for SharePoint this year. It will come bundled with the updated version of the administrator toolkit for SharePoint 2010.
Ever had to trawl through multiple logs from multiple load-balanced servers to figure out what really happened? Wished you could have your perfmon counters ready when analysing problems? The next version of the SPDiag tool will let you do that and much more. Run reports, save report snapshots, find events based on time, correlation ID and... the login name! So Jo complains his page crashed and he did not make a note of the correlation ID? No problem. Correlate all logs into one place on your desktop with minimal strain on the servers, then run a search on his logon and the time window when it crashed, and you'll have all the surrounding details, processes and counters available at your fingertips.
It also includes database stats, network stats and latency statistics to identify bottlenecks in some of your service application communications.
This tool was developed by the product team at Microsoft to assist them with the SharePoint development process. What's good for them can't be bad for us mere mortals!
Can't wait to get my hands on this gem. Supposedly coming out this month. Which really means sometime this year, probably during Q2 some time.

Friday, March 18, 2011

How to Add Custom Columns to the Search Results in MOSS 2007

A few years ago I demonstrated at the New Zealand Conference how to add custom metadata columns to your search results. What I never did was make a blog post about it. So let me rectify that error of mine.

To get custom columns to appear on your search results page in MOSS you will need to do a few things. First make sure the metadata field is indexed and available by adding it as a managed property in the Search Administration of your SSP. Then recrawl your index. And finally, show the new column on the search results page by tweaking some XML and XSL.

1) Get the column indexed and mapped
To do this, open up your SSP admin site either directly or via Central Admin. Then navigate to the Search Administration page and look for the metadata property mappings (managed properties) in the left menu. There you will need to add a new managed property which maps your custom field. This field could be in a SQL store indexed via the BDC, in a custom list, or in a custom content type. In any case I assume that you have already crawled all the content in question, otherwise this step won't work.
2) Recrawl the index
Once you have added the managed property mapping you need to do a full crawl. Sorry, an incremental crawl will not work, as only in a full crawl does the engine map all existing fields in the index.
3) Customise the Search Results
Browse to the search results page in your Search Centre. The fastest way? Type a search into any search box! If your farm is configured correctly, that should catapult you to the correct page.
There you need to edit the page and customise the Core Search Results WebPart.

Before you edit the actual XSL look and feel you need to tell the web part about the new column. So first click the three dots next to "Selected Columns". You find this option under the Results Query Options section (refer to the screenshot above). Copy the text out of the editor into Notepad. You'll get this:
Well, you won't get that exactly. It will all be on one line and illegible. Feel free to use an XML editor of your choice instead of Notepad.
In that XML you now need to add the managed property name. Make sure you use the correct case; XML is case-sensitive! Also, add the name you gave the managed property and not the source metadata column (in case these two have different names). Once you have added the column to the XML, copy the text back into the popup for the "Selected Columns" option.
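For illustration, here is a rough sketch of what the edited Columns XML could look like. The standard column names shown are just examples, not an exact copy of the out-of-the-box list; MyCustomColumn stands in for your own managed property name.

<root xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <Columns>
    <Column Name="Title"/>
    <Column Name="Path"/>
    <Column Name="Description"/>
    <Column Name="Write"/>
    <Column Name="MyCustomColumn"/>
  </Columns>
</root>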

(note the new MyCustomColumn entry at the bottom)
Once you have added the xml for the new column you can add the column to the output in the XSLT.
Click the XSL Editor button on the web part customisation interface (see the first screenshot) and add the new field to your XSL.

After adding the markup, the new data appears right next to the description of the search result.
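As a rough sketch (the element and class names here are representative, not copied from the out-of-the-box XSL; mycustomcolumn is the lowercased name of the new column):

<div class="srch-Description">
  <xsl:value-of select="description"/>
  <xsl:text> </xsl:text>
  <xsl:value-of select="mycustomcolumn"/>
</div>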


That's it. Copy the changed XSL back into the XSL editor popup, apply your changes to the web part and test your results by executing a fresh search which includes some of the custom metadata that you have indexed.
The next step would be to create a custom scope for your BDC-indexed data, create a custom search page and search results page for that scope, and go ape-shit on the search results page with loads of custom columns and a custom look and feel.
Happy Customising folks!

Monday, March 14, 2011

How to pass 70-576

I totally forgot to post the fourth post in the series. If you want to earn your SharePoint 2010 Professional Developer credential then you'll also need to pass this exam. MeasureUp does not have this one on its books yet, so no legal cheating on this one.
The questions are not quite as scenario-based as in many of the .NET developer Pro exams, but not quite as code-driven as the TS exams either. Expect simple scenarios with simple-seeming answers.
Actually there are quite a few trick answers where there seem to be two valid choices until you spot the error. My best advice on this exam is to eliminate the wrong answers. Quite a few will be obvious bull answers, but there will be the occasional one that will make you doubt. Re-read the question; often there is a clue or constraint in the mini scenario which will eliminate the second option.
A lot of questions are common sense on development and deployment techniques. Make sure you're up to scratch on best practices for features, solutions, dependencies, upgrading and the new 2010 features.
Warning! Only because it is a new feature in 2010 doesn't mean that it must be the right choice!
Last tip, as with any Microsoft exam, stick to your first gut choice when in doubt. And only move from that when you are 100% sure that the new answer is the correct one.



Thursday, March 10, 2011

Air New Zealand 777-300 Review

Just had the pleasure of viewing one of the new 777-300 planes from Air New Zealand. A lot of good things and a few bad things to comment on:
Great:

  • Arm rests fold up in economy. Great if travelling as my wife and I normally do.
  • Sky couch can hold two people who weigh up to 150kg each. (300 Pounds on the leg rest alone)
  • Touch Screens work like a charm and at last no need for the annoying remote control
  • Remote/Phone stowed in-front and not in the side (why did they not think of that earlier?)
  • New menu is supposed to be much better (I'll wait and see)
  • Power sockets also in economy. (Which laptop battery lasts for 12h-24h?)
  • Headrest holds up! (That drove me mad in the old planes. Always falling down again.)
  • Special cushions that attach to headrest. No more fussing with the cushion when standing up and sitting back down again.
  • Sections are shorter with an added canteen in the middle. Means fewer people running past and faster service all round.

Not so Great:

  • Seats narrower and Aisles narrower in economy.
  • Premium Economy lacks leg-room due to hard case shell design of seats. Max height for premium economy passenger: 178cm. Anything beyond that and you're bound to get sore legs as you can't stretch them completely. 
  • SkyCouch not very spacious. Gonna be very cuddly! And be sure not to stretch your legs or you'll hit the person in the middle aisle.

Secret Tips:

  • Three rows of two seaters at the back of the plane. Great for couples. Also have 1inch more leg-room!
  • Don't take a seat in the 3-person row right in-front of those two-seaters. Armrests don't go all the way up but are stuck at horizontal. 
Summary:
All in all I am looking forward to flying long haul in the new Boeing 777-300s. It looks like it will be very enjoyable, and no, I won't be upgrading to premium economy on those flights. Unless I could choose my seats before booking the flight. Now that would be just perfect!

Wednesday, February 16, 2011

The Social Media Revolution

Who has, once in their life, had the inner desire to change the world? You know, when you were a kid, that innocent wish for world peace?
Or better yet… who has never hoped to change a thing because it seems too hard or simply impossible to achieve?

Well, I’ve got news for you!
Never before has it been easier to change the world, or at least the little piece of the world that surrounds you.

Ever since Wikileaks began publishing confidential information back in 2007 the world has been in uproar. And the US government has tried to silence the truths about its involvement and handling of international affairs ever since.

Although it is ludicrous to claim that Wikileaks is responsible for the recent revolutions in Tunisia and Egypt, it has nevertheless helped raise international awareness of the widespread corruption, thus making it just a little harder for dictatorships to defuse these situations.
Without Wikileaks the Vice President of Egypt would have been in a much stronger position, but now that the whole world knows he is buddies with Israel, his presence alone added fuel to the burning fires on the streets of Egypt.
And what was the first thing the Egyptian president did when the revolts got out of hand? Shut down access to Twitter and Facebook. When the uprising started in Egypt everyone was quick to blame it on Islamic militants. But the tens of thousands of people gathering on parliament square were a far cry from blood-curdling terrorists! They were lawyers, shopkeepers and engineers; the average Joe Bloggs who simply had had enough! They were sick and tired of the violence, oppression and corruption in their country. These people did not attend secret rebel meetings before joining the demonstrations. Nor did they subscribe to the anarchist grapevine. No, they heard about the movement on Facebook, Twitter and the local media and decided it was time to act and stand up for their rights.

The Jasmine revolution in Tunisia started with Mohammed Bouazizi setting himself on fire in an act of desperation after being tyrannised and humiliated in public. It ended in a mass revolt which spread throughout the country, partly organised through Facebook and Twitter. One of the reasons these tools were so effective in spreading the word in Tunisia was due to the high percentage of youths in that country. Over half the population is under 25 years of age.
Tunisia and Egypt; these are two prime examples where social media helped spread the word and generate a critical mass that lead to drastic changes in those countries.

Another group using social media as a weapon to gather support for their cause is Greenpeace. Through Facebook and YouTube channels they have forced Burger King and Nestlé to look for more sustainable sources of palm kernel. They also put a big dent in Fonterra’s online brand, which effectively committed “Facebook suicide” by totally mismanaging its response to the online pressure.
One of their most effective tools was a YouTube video showing a jogger choking on orang-utan hair while drinking a glass of Fonterra milk. That video was so effective that they were forced to pull it from their YouTube channel due to legal action from Fonterra.

YouTube has become not only a favourite source of daytime entertainment but has also proven itself a powerful viral communication tool for spreading the word. Wikileaks first published the video of American soldiers massacring Iraqi civilians on YouTube before it was removed due to pressure from the US government. And posting YouTube videos is becoming easier by the day. Google predicts that there will be more smartphones in the world than computers by 2016. Making a video and uploading it to your YouTube channel is just a click away.

I’m not saying that we all need to start toppling governments or chaining ourselves to trees. But we can all change the world if we want to. The tools are there for us; we only need to use them. The next time you see a person being beaten up on the street, take out your phone and make a video. Upload it to YouTube before you turn the corner (and run). You find out about a teacher abusing children? Tell your friends on Facebook about it! You’d be astonished at how many people will start to listen if you start to talk. And mention the good things in life too! Someone helped you on the street when you were looking for directions? Why not say thank you via Twitter?

The world is listening. All you need to do is start talking and changing the world around you one tweet at a time.

Tuesday, February 8, 2011

Enabling RBS on additional Databases

You'd think it to be really easy to find good information on enabling RBS for additional content databases out there. But if you landed on this blog post, then you were proven wrong, like so many others.
Actually it is not that hard.
Once you have followed all the instructions to set up RBS for the first time:
http://technet.microsoft.com/en-us/library/ee663474.aspx (SharePoint Foundation 2010)
http://technet.microsoft.com/en-us/library/ee748631.aspx (SharePoint Server 2010)
You then need to follow these steps for each additional ContentDB:
http://technet.microsoft.com/en-us/library/ee748641.aspx (Universal)

The important part in the last link is the different msiexec command, as it uses different ADDLOCAL parameters.
Also keep in mind that if you are relying on the FILESTREAM provider in SQL Server 2008 R2 you need not install RBS on the SQL Server itself, as it is supported by default. If you plan on using RBS with a third-party provider, then you will need to run the msi on the SQL Server too:
http://technet.microsoft.com/en-us/library/ff629463.aspx (SharePoint Server 2010 without FILESTREAM)
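For reference, here is a minimal sketch of the PowerShell part of the per-content-database procedure linked above (the content database name is a placeholder; run it in the SharePoint 2010 Management Shell after the msi has been run for that database):

$cdb = Get-SPContentDatabase "WSS_Content_YourSecondDB"
$rbss = $cdb.RemoteBlobStorageSettings
$rbss.Installed()          # should return True for this database
$rbss.Enable()
$rbss.SetActiveProviderName($rbss.GetProviderNames()[0])
$rbss                      # shows the now-active provider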

Happy Remote Blobbing!

Tuesday, February 1, 2011

Keeping Dead Wood out of SharePoint. Or, how to avoid Zombies.

In his article "What NOT to migrate into SharePoint", Joel identifies the usual suspects: the types of files which are not suited to being stored in SharePoint, or in any DMS for that matter.

While Joel covers the what quite exquisitely, he does not cover the why from an IA point of view. I have seen companies throw hundreds of thousands of dollars at setting up SharePoint farms to store ALL their documents, including massive investments in infrastructure and the equally large projects required to design a stable and usable system with minimal change impact on the end users. But what they all forget to ask is the big WHY. Why migrate the content? Where is the business value? The standard answers will most of the time go along the lines of “because with SharePoint we can manage the files better”, “we then have versioning”, “one central platform instead of disparate locations”. And that is when the headache begins. Dead documents are brought to life in SharePoint only for the sake of having one single source of truth. These zombie documents devour resources and money the instant they are resurrected and cause major maintenance headaches for all involved.

When the team of BAs and IAs are classifying all the documents they often skip the most important classification: is this document still active or is it at the end of its life? Will there be any form of collaboration going on, or will it only be used as reference material? If you can make that distinction you will easily be able to identify exactly which files need to move into a managed environment and which can stay in an unmanaged location. Many projects shy away from this question simply because it can be a daunting task. There might be millions of documents that need to be classified as active or non-active. There is also the problem of having separate locations for associated documents. This is where third-party providers can come in extremely handy. While some offer web parts which will display existing file shares in SharePoint in a more integrated manner, they lack the full functionality of a SharePoint document library. So if you do not need versioning or SharePoint-driven security, have a look for network share web parts on http://www.sharepointreviews.com/; there are some listed there.

If, on the other hand, you want versioning, SharePoint-driven security and check-in/check-out, then more complex solutions will be needed, such as the File Share Library by Bamboo Solutions, which synchronises a file share with a document library. The only downside is that files will start clogging up the database once they move from stale to active use. This way at least the databases will grow gradually over time as more and more documents are activated. The final solution I want to suggest is the FileShare connector from AvePoint. They have designed a solution based on their popular EBS-driven Storage Extender which keeps the file share separate from the database even once files are actively used and versioned in SharePoint. The magic lies in a hidden folder on the file share which is used to maintain the new versions of the files as they progress through their lifecycle.

But beware! Just because the DBA no longer needs to worry about 5TB content databases does not mean nobody should. A well-planned major and minor versioning strategy and user education are key to keeping the network storage from blowing out of proportion. So do yourself a favour and have a look at some of the tools on the market before needlessly resurrecting dead documents into SharePoint.

Friday, January 28, 2011

Microsoft Learning Snacks now on YouTube

The team at Microsoft Learning has realised that YouTube is a much better channel for online video-based learning than hosting the videos on their own sites. I totally agree! Subscribe now to the Microsoft Learning channel on YouTube to get your daily fix of short learning videos, ranging from using Office to my favourite topic: programming SharePoint.
Here is a sample Learning Snack: LINQ to SharePoint!