This section describes the workloads placed on the product during performance testing, the hardware used and the topology in which it was deployed, and brief notes about the dataset used during the testing. Together, these three aspects provide an overview of the performance of the Office Web Apps.
Workload
For the Word Web App and the PowerPoint® Web App, it is important to consider viewing documents separately from editing them, because these two modes are serviced differently by the server deployment and have different performance characteristics. In the OneNote® Web App, the distinction is much smaller and therefore need not be made when considering capacity.
The workloads tested are:
Viewing documents in the Word Web App and Viewing presentations in the PowerPoint Web App
Editing documents in the Word Web App and Editing presentations in the PowerPoint Web App
Viewing PowerPoint Broadcasts in the PowerPoint Web App as attendees
Viewing/Editing OneNote Web App notebooks
The testing for these workloads was designed to help develop estimates of how different farm configurations respond to changes to the following variables:
Mix of which Web Apps are used, and how often each is used
Effect of cache hit rate on viewing previously rendered documents/presentations
Type of documents/presentations and expected mix of requests
It is important to note that the specific capacity and performance figures presented in this article will be different from the figures in real-world environments. The figures presented are intended to provide a starting point for the design of an appropriately scaled environment. After you have completed your initial system design, test the configuration to determine whether your system will support the factors in your environment.
Test definitions
This section defines the test scenarios and provides an overview of the test process that was used for each scenario. Detailed information such as test results and specific parameters are given in each of the test results sections later in this article.
Word Web App viewing
Each of the tests below was performed twice: once with the document's output format set to PNG, and once with it set to Silverlight. The exact mix of how often each test was used is given in the Test mix section later in this article.
| Test name | Test description |
| --- | --- |
| Full Document Reading | Open the document. Scroll to the next page, pausing on each page. Scroll to the last page. Close the document. |
| Multiple Search And Read | Open the document. Scroll to a random page. Execute a find command, navigate to a result. Scroll to a random page. Execute a second find command, navigate to a result. Close the document. |
| Single Search And Read | Open the document. Execute a find command, navigate to a result. Scroll to each subsequent page until the end of the document. Close the document. |
| Wrong Document Read | Open the document. Scroll to the second page. Close the document. |
| Print | Print the document to PDF format. |
Word Web App editing
| Test name | Test description |
| --- | --- |
| Full Editing | Load the Word Editor. Load the document. Spell check the document and then pause. Simulate typing: perform various saves and spelling requests with wait times in between. Close the document. |
PowerPoint Web App viewing
| Test name | Test description |
| --- | --- |
| Full Viewing | Open the PowerPoint Viewer. Load the presentation. View the slide and pause. Continue to the next slide and pause. Repeat until the end of the presentation. |
PowerPoint Web App editing
| Test name | Test description |
| --- | --- |
| Full Editing | Open the PowerPoint Viewer. Load the presentation. View the slide, with a 75% chance of editing a text object, and pause. Continue to the next slide, with a 75% chance of editing a text object, and pause. Repeat until the end of the presentation. Close the presentation and save. |
PowerPoint Web App Broadcast
| Test name | Test description |
| --- | --- |
| Full Broadcast | Create a PowerPoint Broadcast from a presentation. View each Broadcast in the PowerPoint Viewer with five different attendees. Trigger viewing of the slide and pause. Continue to the next slide and pause. Repeat until the end of the presentation. End the Broadcast. |
OneNote Web App
| Test name | Test description |
| --- | --- |
| Collaboration Scenario #1 | Sync interval is manipulated to be every 5 seconds. Load the notebook. Click on a new page and pause. Perform 1 minute of editing, followed by spell checking. Perform another minute of editing and spell checking. Insert an image. Paste a large amount of data into the page, and spell check. Make changes to the pasted data. Delete some content. |
| Collaboration Scenario #2 | Sync interval is manipulated to be every 5 seconds. Load the notebook. Periodically save the notebook at random intervals. |
| Single User Scenario #1 | Sync interval is manipulated to be every 30 seconds. Load the notebook. Click on a new page and pause. Make an edit that causes a time-based version to be pinned. Perform two minutes of edits, followed by spell checking. Insert an image. Paste a large chunk of data into the page. Delete some content. |
| Single User Scenario #2 | Sync interval is manipulated to be every 30 seconds. Load the notebook. Periodically save the notebook at random intervals. |
Test mix
Word Web App viewing
| Solution name | Output Format | % in the mix |
| --- | --- | --- |
| Full Document Reading | PNG | 9.75 |
| Full Document Reading | SL | 3.25 |
| Multiple Search and Read | PNG | 40.5 |
| Multiple Search and Read | SL | 13.5 |
| Single Search and Read | PNG | 17.25 |
| Single Search and Read | SL | 5.75 |
| Wrong Document Read | PNG | 4.5 |
| Wrong Document Read | SL | 1.5 |
| Print | PDF | 4 |
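The percentages in the table above sum to 100 and can be read as selection weights. As a rough illustration only (this is not the actual test harness, and the names used here are invented for the example), a load generator might pick the next Word Web App viewing test with a weighted random draw like this:

```python
import random
from collections import Counter

# Word Web App viewing test mix from the table above:
# (test name, output format, percent of the mix).
TEST_MIX = [
    ("Full Document Reading", "PNG", 9.75),
    ("Full Document Reading", "SL", 3.25),
    ("Multiple Search and Read", "PNG", 40.5),
    ("Multiple Search and Read", "SL", 13.5),
    ("Single Search and Read", "PNG", 17.25),
    ("Single Search and Read", "SL", 5.75),
    ("Wrong Document Read", "PNG", 4.5),
    ("Wrong Document Read", "SL", 1.5),
    ("Print", "PDF", 4.0),
]

def pick_next_test():
    """Draw one (test name, output format) pair, weighted by '% in the mix'."""
    weights = [percent for _, _, percent in TEST_MIX]
    name, output_format, _ = random.choices(TEST_MIX, weights=weights, k=1)[0]
    return name, output_format

# Sanity check: over many draws, the observed proportions approach the mix.
counts = Counter(pick_next_test() for _ in range(10000))
for (name, output_format), count in counts.most_common():
    print(f"{name} ({output_format}): {count / 100:.2f}%")
```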
Word Web App editing
| Solution name | % in the mix |
| --- | --- |
| Full Editing | 100 |
OneNote Web App
| Solution name | % in the mix |
| --- | --- |
| Collaboration Scenario #1 | 5 |
| Collaboration Scenario #2 | 5 |
| Single User Scenario #1 | 45 |
| Single User Scenario #2 | 45 |
PowerPoint Web App viewing
| Solution name | % in the mix |
| --- | --- |
| Full Viewing | 100 |
PowerPoint Web App editing
| Solution name | % in the mix |
| --- | --- |
| Full Editing | 100 |
PowerPoint Web App Broadcast
| Solution name | % in the mix |
| --- | --- |
| Full Broadcast | 100 |
Hardware setting and topology
Lab hardware
To provide a high level of test-result detail, several farm configurations were used for testing. Farm configurations ranged from one to six Web servers and a single database server computer that is running Microsoft® SQL Server® 2008 database software. Testing was performed with several client computers. All Web server computers and the database server were 64-bit, and the client computers were 32-bit. No other SharePoint Server-specific load was occurring during testing, and the only machines that were manipulated were those serving Web App requests.
This article focuses on the front-end Web servers and the application servers, and on how their characteristics relate to Web App capacity.
The following table lists the specific hardware that was used for testing.
| Machine name | WFE1-8 | App Servers | SPSQL |
| --- | --- | --- | --- |
| Role | WFE | App | SQL Server |
| Processor(s) | 2 processors, 4 cores each, @ 2.33 GHz | 2 processors, 4 cores each, @ 2.33 GHz | 4 processors, 4 cores each, @ 3.2 GHz |
| RAM | 8 GB | 8 GB | 16 GB |
| Operating system | Windows Server® 2008 SP2 x64 | Windows Server 2008 SP2 x64 | Windows Server 2008 SP2 x64 |
| Storage and geometry (including SQL Server disk configuration) | 6 + 75 + 590 GB | 6 + 75 + 590 GB | 6 + 75 + 460 GB |
| # of NICs | 2 | 2 | 2 |
| NIC speed | 1 Gbps | 1 Gbps | 1 Gbps |
| Authentication | Basic | NTLM | NTLM |
| Software version | 4753.1000 | 4753.1000 | SQL Server 2008 |
| # of instances of SQL Server | | | 1 |
| Load balancer type | NLB | | |
| ULS logging level | Medium | Medium | Medium |
Topology
Different applications require different topologies. In some cases, where more than one machine role is required to fulfill a request, different topologies were tested in which the ratio of front-end Web servers to application servers was varied. In these cases, the "Bottleneck" column in the tables below describes which tier ran out of headroom first, the front-end Web servers or the application servers. This information is useful when you know how heavily loaded your front-end Web servers will be: if there is already a lot of load on the front-end Web servers, deploying the Web Apps in a topology where the application servers run out of headroom first places the least additional load on the front-end Web servers.
For Word Web App Viewing with no cache hits, PowerPoint Web App Viewing with no cache hits, PowerPoint Web App Editing, and PowerPoint Broadcast, an application server is necessary to render the document before it is displayed to the end user. The following shows a 1x2 topology, representing one front-end Web server to two application servers.
For Word Web App Viewing serving cached documents or PowerPoint Web App Viewing serving cached documents, only a front-end Web server is necessary. Similarly, Word Web App Editing and OneNote require only front-end Web servers. The following shows a basic topology with one front-end Web server that can handle these types of workloads. (Note that application servers would still be deployed and would be involved in serving Word Web App requests that are not already cached; they are omitted from the diagram only to show which machines are necessary to service these types of requests.)
Dataset
The dataset used for the Web App tests was a series of documents, all in Microsoft Office 2007 file format.
For Word, the documents used ranged from 10 to 216 KB in size, 1 to 30 pages, and 0 to 7,000 words in length. Some documents were simple, with little formatting, while others were quite complex in the number of different styles and formatting used.
For OneNote, all tests began with new, blank notebooks, which increased in size and complexity as the tests progressed.
For PowerPoint, the presentations used ranged in size from 250 to 1275 KB and contained on average 15 slides. The presentations similarly contained a range of different types of content.
Test results
The following tables show the test results of the Office Web Apps in SharePoint Server 2010. For each group of tests, only certain specific variables are changed to show the progressive impact on farm performance.
Note that the tests reported on in this article for the Word and OneNote Web Apps include think time, a natural delay between consecutive operations designed to simulate the pauses generated by a user as they examine the results of their last request to the server and determine the next request they will make. These included think times are only an approximation of what may be seen in a real-world environment.
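For example, a simulated session that models think time might look something like the sketch below. This is an illustration only; the 5 to 15 second range and the placeholder operation names are assumptions made for the example, not values from the published tests.

```python
import random
import time

def run_session(operations, think_time_range=(5.0, 15.0)):
    """Run a sequence of request callables, pausing between them to
    approximate user think time (the range here is illustrative only)."""
    for operation in operations:
        operation()                                     # issue the request
        time.sleep(random.uniform(*think_time_range))   # pause like a real user

# Example usage with hypothetical operations:
# run_session([open_document, scroll_to_next_page, close_document])
```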
For information about bottlenecks in the Office Web Apps in SharePoint Server 2010, see the Common bottlenecks and their causes section later in this article.
Word Web App viewing, no cache hits
The details below give an indication of results for topologies in which the number of front-end Web servers and application servers is varied (for example, a 1x2 topology is one front-end Web server and two application servers, all supported by an instance of SQL Server). The user count is an estimate of the number of users actively viewing documents in the Word Web App that the topology could support.
| Topology | RPS | Average Response Time | Bottleneck | Average WFE CPU | Average App Server CPU | SQL Server CPU | # of active users supported |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 1x1 | 25 | 0.2 seconds | App Server | 8% | 48% | 2% | 860 |
| 1x2 | 33 | 0.16 seconds | HTTP throttling on front-end Web server | 8.5% | 38% | 2.5% | 1040 |
| 2x2 | 48 | 0.16 seconds | App Servers | 8% | 49% | 3.5% | 1600 |
| 2x3 | 64 | 0.15 seconds | HTTP throttling on front-end Web servers | 10% | 42% | 5% | 2100 |
| 3x3 | 65 | 0.12 seconds | App Servers | 7% | 45% | 5.5% | 2200 |
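One way to use these figures is as a lookup when sketching an initial farm size. The helper below is illustrative only and simply reads the published capacities from the table above; validate any sizing decision with your own testing.

```python
# "# of active users supported" figures from the table above
# (Word Web App viewing, no cache hits).
WORD_VIEWING_NO_CACHE = {
    "1x1": 860,
    "1x2": 1040,
    "2x2": 1600,
    "2x3": 2100,
    "3x3": 2200,
}

def smallest_topology_for(target_users, results):
    """Return the smallest tested topology (fewest total machines) whose
    measured capacity meets the target, or None if none of them does."""
    def machine_count(topology):
        wfe, app = topology.split("x")
        return int(wfe) + int(app)

    candidates = [t for t, users in results.items() if users >= target_users]
    return min(candidates, key=machine_count) if candidates else None

print(smallest_topology_for(1500, WORD_VIEWING_NO_CACHE))  # "2x2"
print(smallest_topology_for(5000, WORD_VIEWING_NO_CACHE))  # None: beyond the tested range
```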
Word Web App viewing, all cache hits
Similar to the previous scenario, this simulates performance when every document being requested has already been rendered and is in the Web App cache. Because the documents do not have to be re-rendered, RPS and throughput increase, and the application servers are not needed because the front-end Web servers can serve the requests directly. Note that the Bottleneck column is removed, because in each case HTTP throttling is the bottleneck.
| Topology | RPS | Average Response Time | Average WFE CPU | SQL Server CPU | # of active users supported |
| --- | --- | --- | --- | --- | --- |
| 1 WFE | 24 | 0.15 seconds | 11% | 2% | 990 |
| 2 WFE | 33 | 0.25 seconds | 7.5% | 2.5% | 1500 |
| 3 WFE | 50 | 0.25 seconds | 7% | 3.5% | 2250 |
| 4 WFE | 80 | 0.35 seconds | 10% | 4.5% | 3100 |
| 5 WFE | 108 | 0.05 seconds | 10% | 7.5% | 4400 |
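The tests report only the two extremes: no cache hits and all cache hits. For intermediate cache hit rates, a back-of-envelope approach is to blend the two published capacities with a weighted harmonic mean, as in the sketch below. This formula is an assumption of the sketch, not something measured in these tests, so treat its output only as a starting point.

```python
def blended_capacity(cache_hit_rate, cached_capacity, uncached_capacity):
    """Rough estimate: weighted harmonic mean of the all-cache-hits and
    no-cache-hits user capacities (an assumption, not a measured figure)."""
    miss_rate = 1.0 - cache_hit_rate
    return 1.0 / (cache_hit_rate / cached_capacity + miss_rate / uncached_capacity)

# Example: roughly one front-end Web server plus one application server
# (the "1 WFE" cached row and the "1x1" uncached row), assuming 90% cache hits.
print(round(blended_capacity(0.90, cached_capacity=990, uncached_capacity=860)))  # ~975
```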
Word Web App editing
When editing documents, only a front-end Web server is required. Because heavy processing can happen during editing, there is a spectrum of how much load can be placed on a given set of machines. The ends of this spectrum are represented by the "red zone" and the "green zone". Deploying the Web Apps and targeting the performance characteristics described in the Green Zone table below is recommended. In situations where you know the front-end Web servers will have very little work other than servicing Office Web App sessions, targeting performance characteristics closer to the "red zone" is reasonable.
Green Zone
| Topology | RPS | Average Response Time | Average WFE CPU | SQL Server CPU | # of active users supported |
| --- | --- | --- | --- | --- | --- |
| 1 WFE | 285 | 0.03 seconds | 50% | 3% | 240 |
| 2 WFE | 292 | 0.04 seconds | 50% | 8% | 540 |
| 3 WFE | 330 | 0.06 seconds | 50% | 12% | 720 |
Red Zone
| Topology | RPS | Average Response Time | Average WFE CPU | SQL Server CPU | # of active users supported |
| --- | --- | --- | --- | --- | --- |
| 1 WFE | 286 | 0.04 seconds | 75% | 5% | 420 |
| 2 WFE | 333 | 0.08 seconds | 74% | 12% | 780 |
| 3 WFE | 600 | 0.14 seconds | 75% | 19% | 1200 |
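As an illustration of how the zone tables can feed an initial sizing estimate, the sketch below picks the smallest tested front-end Web server count whose measured Word Web App editing capacity meets a target number of concurrent editors. A real sizing exercise should also account for whatever other load the front-end Web servers already carry.

```python
# "# of active users supported" for Word Web App editing, keyed by the number
# of front-end Web servers (from the Green Zone and Red Zone tables above).
GREEN_ZONE = {1: 240, 2: 540, 3: 720}
RED_ZONE = {1: 420, 2: 780, 3: 1200}

def wfes_needed(target_editors, zone):
    """Smallest tested WFE count that meets the target, or None if the target
    exceeds the largest tested configuration."""
    for wfe_count in sorted(zone):
        if zone[wfe_count] >= target_editors:
            return wfe_count
    return None

print(wfes_needed(700, GREEN_ZONE))  # 3 (720 >= 700)
print(wfes_needed(700, RED_ZONE))    # 2 (780 >= 700)
```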
OneNote Web App
When editing OneNote notebooks, only a front-end Web server is required. As above, results are given both for the recommended Green Zone performance characteristics and for the upper limits specified in the Red Zone table.
Green Zone
| Topology | RPS | Average Response Time | Average WFE CPU | SQL Server CPU | # of active users supported |
| --- | --- | --- | --- | --- | --- |
| 1 WFE | 97 | 0.1 seconds | 50% | 9% | 1260 |
| 2 WFE | 199 | 0.15 seconds | 50% | 19% | 2520 |
| 3 WFE | 275 | 0.5 seconds | 50% | 30% | 3720 |
Red Zone
| Topology | RPS | Average Response Time | Average WFE CPU | SQL Server CPU | # of active users supported |
| --- | --- | --- | --- | --- | --- |
| 1 WFE | 135 | 0.4 seconds | 75% | 12% | 1700 |
| 2 WFE | 250 | 1.0 second | 75% | 28% | 3780 |
| 3 WFE | 340 | 1.0 second | 61% | 36% | 5160 |
PowerPoint Web App viewing uncached
When viewing a PowerPoint file in the Web App, the app server is used to render the file into the web viewer’s format. Renders are then placed in the Web App cache.
The details below give an indication of results for topologies in which the number of front-end Web servers and application servers is varied (for example, a 1x2 topology is one front-end Web server and two application servers, all supported by an instance of SQL Server). The user count is an estimate of the number of users actively viewing a presentation in the PowerPoint Web App that the topology could support.
| Topology | RPS | Average Response Time | Bottleneck | Average WFE CPU | Average App Server CPU | SQL Server CPU | # of active users supported |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 1x1 | 90 | 0.04 seconds | App Server | 7.3% | 68% | 2.1% | 900 |
| 1x2 | 140 | 0.045 seconds | HTTP throttling on front-end Web server | 10% | 58% | 3% | 1500 |
| 2x2 | 158 | 0.047 seconds | App Server | 5.4% | 62% | 3.6% | 1500 |
| 2x3 | 200 | 0.042 seconds | App Server | 7.45% | 55% | 4.7% | 2100 |
| 3x3 | 192 | 0.05 seconds | HTTP throttling on front-end Web server | 4% | 66% | 5% | 2000 |
PowerPoint Web App viewing cached
Similar to the uncached scenario, this simulates performance when every presentation being requested has already been rendered and is in the Web App cache. Because the presentations do not have to be re-rendered, RPS and throughput increase, and the application servers are not needed because the front-end Web servers can serve the requests directly. Note that the Bottleneck column is removed, because in each case HTTP throttling is the bottleneck.
| Topology | RPS | Average Response Time | Average WFE CPU | SQL Server CPU | # of active users supported |
| --- | --- | --- | --- | --- | --- |
| 1 WFE | 350 | 0.01 seconds | 21.1% | 3% | 700 |
| 2 WFE | 200 | 0.01 seconds | 7% | 2% | 1400 |
| 3 WFE | 111 | 0.01 seconds | 12% | 2% | 2100 |
| 4 WFE | 180 | 0.01 seconds | 8% | 3% | 2857 |
| 5 WFE | 225 | 0.01 seconds | 5% | 3.5% | 3571 |
PowerPoint Web App editing
When editing PowerPoint files in the Web App, both a front-end Web server and an application server are required. However, the vast majority of the load is on the application server, which is memory bound.
| Topology | RPS | Average Response Time (seconds) | Average WFE CPU | Average App Server CPU | App Server Memory Usage | SQL Server CPU | # of active users supported |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 1x1 | 48 | 1.18 | 2.8% | 40% | 87.5% | 0.6% | 600 |
| 1x2 | 125 | 1.19 | 4.76% | 37% | 87.5% | 1.3% | 1200 |
| 1x3 | 142 | 1.28 | 6.58% | 34.6% | 87.5% | 1.3% | 1800 |
PowerPoint Broadcast (default, “MaxPendingReceives=1”)
When viewing PowerPoint Broadcasts in the Web App, both a web front end and application server are required. Each attendee pings the server every second to determine the broadcast’s state, so RPS is roughly indicative of the number of active users supported.
By default, the web front ends are bottlenecked by the “MaxPendingReceives” setting on the application server. Data is shown for both the default “MaxPendingReceives” setting of 1 and a “tuned” setting of 10. PowerPoint Broadcast usage of web front ends may be throttled if CPU usage is a concern. For more information about performance tuning a server farm for PowerPoint Broadcast, see Configure Broadcast Slide Show performance.
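Because every attendee polls the server once per second, the sustained request rate a Broadcast farm must absorb grows roughly linearly with the attendee count. A minimal sketch of that arithmetic (illustrative only):

```python
def broadcast_rps(attendees, poll_interval_seconds=1.0):
    """Approximate steady-state RPS generated by Broadcast attendees, each of
    whom polls the server once per poll interval (once per second by default)."""
    return attendees / poll_interval_seconds

# Example: 600 attendees generate roughly 600 RPS, which lines up with the
# 1x2 row of the default-settings table below (590 RPS, 600 users).
print(broadcast_rps(600))  # 600.0
```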
| Topology | RPS | Average Response Time (seconds) | Average WFE CPU | Average App Server CPU | SQL Server CPU | # of active users supported |
| --- | --- | --- | --- | --- | --- | --- |
| 1x1 | 295 | 0.36 | 20.7% | 37.3% | 3.7% | 300 |
| 1x2 | 590 | 0.32 | 30.5% | 23.5% | 1.7% | 600 |
| 2x2 | 671 | 0.83 | 18.1% | 38.45% | 2% | 700 |
| 2x3 | 797 | 0.47 | 26.5% | 26.5% | 2% | 800 |
| 3x3 | 842 | 0.87 | 19% | 34% | 3% | 850 |
PowerPoint Broadcast ("MaxPendingReceives=10")
With the "MaxPendingReceives" setting set to 10 in the web.config file (see the guide linked above for details on how to change this setting), the throughput and the number of supported users increase greatly for a given topology. This also places a much heavier load on the front-end Web server CPU, which is the tradeoff for the extra throughput.
| Topology | RPS | Average Response Time (seconds) | Average WFE CPU | Average App Server CPU | SQL Server CPU | # of active users supported |
| --- | --- | --- | --- | --- | --- | --- |
| 1x1 | 1070 | 0.16 | 95% | 37.5% | 1% | 1000 |
| 1x2 | 1024 | 0.17 | 95% | 12% | 1% | 1000 |
| 2x2 | 1934 | 0.16 | 48% | 39.5% | 1.5% | 2000 |
| 2x3 | 1823 | 0.15 | 35% | 20% | 1.6% | 2000 |
| 3x3 | 2779 | 0.12 | 41.5% | 33% | 2.2% | 2800 |