Automated Testing

From OpenSimulator


Revision as of 11:58, 2 October 2008



Overview

As OpenSim matures, we are extremely interested in adding more automated verification into the OpenSim source tree. Tests exist not only to prevent bugs from creeping in, but also to provide fair warning that the behavior of the system has changed and that tests may need updating.

In OpenSim today we use NUnit tests. Our conventions are:

  1. Tests should not exist inside runtime assemblies, as this makes NUnit a production requirement
  2. Tests should be in .Tests.dll assemblies. For instance, the tests for OpenSim.Data.SQLite.dll should be in the OpenSim.Data.SQLite.Tests.dll assembly. This allows for easy removal of test assemblies in products.
  3. Tests should be as close to the code as possible, but not intermingled. So the tests for OpenSim/Data/SQLite should be in OpenSim/Data/SQLite/Tests/. Through the use of the Exclude keyword in prebuild.xml you can ensure that directory is part of OpenSim.Data.SQLite.Tests.dll and not OpenSim.Data.SQLite.dll.
  4. Tests should be able to run safely in a production environment. That means that care must be taken not to damage data on the machine that it is being run.
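Rule 3 relies on Prebuild's Exclude support. A hypothetical prebuild.xml fragment is sketched below; the element and attribute names here are illustrative and should be checked against the prebuild.xml already in the tree rather than taken as the exact in-tree configuration:

```xml
<!-- Sketch: the runtime project pulls in all sources but keeps Tests/ out,
     while a separate .Tests project builds only that subdirectory. -->
<Project name="OpenSim.Data.SQLite" path="OpenSim/Data/SQLite" type="Library">
  <Files>
    <!-- Exclude the Tests directory from the runtime assembly -->
    <Match pattern="*.cs" recurse="true">
      <Exclude name="Tests" />
    </Match>
  </Files>
</Project>

<Project name="OpenSim.Data.SQLite.Tests" path="OpenSim/Data/SQLite/Tests" type="Library">
  <Files>
    <Match pattern="*.cs" recurse="true" />
  </Files>
</Project>
```

With this split, dropping test code from a product is just a matter of not shipping the .Tests.dll assemblies.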

Writing New Tests

Writing a new unit test is pretty easy, and very helpful in increasing the stability of OpenSim by nailing down bugs. I'm going to present an example here of SQLite Asset testing to show how simple such a test case is to write. The actual in-tree SQLite Asset tests are a little different, because the code was factored out so that it could easily be applied to any database driver, so don't be concerned that what you see here isn't in the tree.

NUnit Conventions

An NUnit test suite:

  • is a class with a default constructor (takes no arguments)
  • has public methods that are tests
  • uses annotations to determine what are tests
  • runs its tests in alphabetical order by method name

An NUnit test method:

  • must be public
  • must return void
  • must take no arguments
  • is successful if no exception or assert is thrown while running it

The run order is important if you want to have early tests that set up some complicated state (like creating objects), and have later tests remove or update that state. For that reason I find it very helpful to name all test methods Txxx_somename, where xxx is a number between 000 and 999. That guarantees no surprises in run order.

An Example Test - SQLite Assets

using System;
using System.IO;
using System.Collections.Generic;
using NUnit.Framework;
using NUnit.Framework.SyntaxHelpers;
using OpenSim.Framework;
using OpenSim.Data.Tests;
using OpenSim.Data.SQLite;
using OpenSim.Region.Environment.Scenes;
using OpenMetaverse;
 
namespace OpenSim.Data.SQLite.Tests
{
    [TestFixture]
    public class SQLiteAssetTest
    {
        public string file;
        public string connect;
        public AssetDataBase db;
        public UUID uuid1;
        public UUID uuid2;
        public UUID uuid3;
        public string name1;
        public string name2;
        public string name3;
        public byte[] asset1;
 
        [TestFixtureSetUp]
        public void Init()
        {
            uuid1 = UUID.Random();
            uuid2 = UUID.Random();
            uuid3 = UUID.Random();
            name1 = "asset one";
            name2 = "asset two";
            name3 = "asset three";
 
            asset1 = new byte[100];
            asset1.Initialize();
            file = Path.GetTempFileName() + ".db";
            connect = "URI=file:" + file + ",version=3";
            db = new SQLiteAssetData();
            db.Initialise(connect);
        }
 
        [TestFixtureTearDown]
        public void Cleanup()
        {
            db.Dispose();
            System.IO.File.Delete(file);
        }
 
        [Test]
        public void T001_LoadEmpty()
        {
            Assert.That(db.ExistsAsset(uuid1), Is.False);
            Assert.That(db.ExistsAsset(uuid2), Is.False);
            Assert.That(db.ExistsAsset(uuid3), Is.False);
        }
 
        [Test]
        public void T010_StoreSimpleAsset()
        {
            AssetBase a1 = new AssetBase(uuid1, name1);
            AssetBase a2 = new AssetBase(uuid2, name2);
            AssetBase a3 = new AssetBase(uuid3, name3);
            a1.Data = asset1;
            a2.Data = asset1;
            a3.Data = asset1;
 
            db.CreateAsset(a1);
            db.CreateAsset(a2);
            db.CreateAsset(a3);
 
            AssetBase a1a = db.FetchAsset(uuid1);
            Assert.That(a1a.ID, Is.EqualTo(uuid1));
            Assert.That(a1a.Name, Is.EqualTo(name1));
 
            AssetBase a2a = db.FetchAsset(uuid2);
            Assert.That(a2a.ID, Is.EqualTo(uuid2));
            Assert.That(a2a.Name, Is.EqualTo(name2));
 
            AssetBase a3a = db.FetchAsset(uuid3);
            Assert.That(a3a.ID, Is.EqualTo(uuid3));
            Assert.That(a3a.Name, Is.EqualTo(name3));
        }
 
        [Test]
        public void T011_ExistsSimpleAsset()
        {
            Assert.That(db.ExistsAsset(uuid1), Is.True);
            Assert.That(db.ExistsAsset(uuid2), Is.True);
            Assert.That(db.ExistsAsset(uuid3), Is.True);
        }
    }
}

You can see 4 of the important annotations here:

  • TestFixture - this class is a test suite
  • TestFixtureSetUp - this code is always run before any of the tests are executed
  • TestFixtureTearDown - this code is always run after the tests are done executing, even if they fail.
  • Test - this method is a test

Setup / Teardown

In the case of testing something like the database layer, we have to actually attempt to store / retrieve things from a database. Following rule #4 of good tests, we want to make sure not to touch production databases when running our tests, so during startup we generate a temporary file name which is guaranteed not to be an existing file on the system, and use that as our database file name. By running db.Initialise() the OpenSim migration code will correctly populate that database with the latest schema.

Once we are done with the tests we want to make sure we aren't leaving garbage temp files on the user's system. So we remove that file we created.

During setup we also create a set of state variables: 3 UUIDs, 3 strings, and a data block. You could just have stuck these inline, but variables are there for a reason, so use them.

Simple Negative Tests
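A negative test checks that operations on data which was never stored behave sanely; T001_LoadEmpty above is one example. A hypothetical follow-on test in the same style is sketched below. The assumption that FetchAsset returns null for a missing UUID, rather than throwing, is mine and should be verified against the actual driver:

```csharp
[Test]
public void T002_FetchMissingAsset()
{
    // Assumption: FetchAsset returns null (rather than throwing)
    // when the UUID is not in the store -- check the driver's contract.
    UUID missing = UUID.Random();
    Assert.That(db.ExistsAsset(missing), Is.False);
    Assert.That(db.FetchAsset(missing), Is.Null);
}
```

Because it uses a freshly generated random UUID, this test stays safe regardless of what the earlier store tests have written.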

Integration into the Build Tree

Good / Bad Test practices

Running Tests

Goal

To create a system of automatic building, testing, and simulation so we can have better awareness of our changes to OpenSim.

Summary

To properly test OpenSim, we will need to address a number of objectives:

  • Crossplatform Support
  • Build Stability
  • Code Test
  • Code Performance
  • Runtime Performance
  • Code Style
  • Web site results
  • Communication

We will be using a number of in-house technologies as well as third-party tools (NUnit, NAnt, etc.) to address our challenges. We will strive to maintain all our testing tools in a package that anyone can use on any computer with any operating system. For convenience, a centralized site for OpenSim testing results will be created at http://testing.ambientunion.com .

People

  • Daedius Moskvitch ( daedius @@@@ daedius.com)

Testing Objectives

Crossplatform Support

There are a number of objectives that should report on the operating system they ran on:

  • Build Stability - Make sure all our builds compile on every operating system
  • Code Test - Make sure all our tests run on every operating system
  • Code Performance - See how our functions run on every operating system
  • Runtime Performance - See how operating systems actually run OpenSim

Build Stability

Ensure that the build is intact.

Code Test

Use NUnit, NCover

Code Performance

Need to ponder

Runtime Performance

Statistics Server

Code Introspection

Use Gendarme

Web site results

Results of our testing should be accessible via a web service. This web service will be available via a maintained Buildbot plugin, so when running a Buildbot master one only needs a website that can access the web service and display it however they wish. Our web service needs to be able to hand off a variety of information:

  • Realtime status information of operations
  • Crossplatform Support information on tests that have been run for some spiffy grids
  • Build results for showing us what happens when things break
  • Code test results for showing us what tests ran
  • Code performance for showing us how all our profiled code worked
  • Runtime Performance statistics information for bar graphs and charts
  • Code Style results for showing which code structure names are non-standard

Communication

We should be able to notify the right people of various important changes (or not so important if they wish) via Email, IRC, etc.

Technologies
