[Opensim-users] Making A 30region clone to a 30 sector Mega-region

John Mieske johnmieske at gmail.com
Sat Jan 30 07:21:21 UTC 2010


OK, I have almost finished the Debian 5 + OpenSim 0.6.8 (post-fixes) + Mono
2.4.2.3 install and setup tutorial. I have included information on installing
MySQL, phpMyAdmin, and Apache2. More on that as soon as I finish it up, which
should be ready in a couple of days. I also got groups to work with my grid
and it seems to be stable, so I'll write that up as a second part as soon as
I can.
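
In the meantime, here is the short version of the package install on Debian 5
(Mono 2.4.2.3 is newer than what Lenny ships, so I build that from source), and
a rough sketch of the [Groups] section I ended up with in OpenSim.ini. Treat
the module and key names as from memory rather than gospel, and double-check
them against the OpenSim.ini.example that ships with your build:

    # as root, on Debian 5 (Lenny)
    apt-get install apache2 mysql-server phpmyadmin

    ; OpenSim.ini - XmlRpc groups, names from memory
    [Groups]
        Enabled = true
        Module = GroupsModule
        ServicesConnectorModule = XmlRpcGroupsServicesConnector
        ; the groups service URL is a placeholder - point it at your own host
        GroupsServerURI = "http://yourgridserver/xmlrpc.php"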

Hope this helps everyone who needs it.

John Mieske


On Fri, Jan 29, 2010 at 8:50 PM, John Mieske <johnmieske at gmail.com> wrote:

> I agree, and I have already tested this. It seems to work, but at the time
> I had some other issues, so I was not sure.
>
> And I agree: NEVER run the client on the same computer as the server / grid
> / standalone. More issues arise because of that. I am also rewriting the
> Debian 5 tutorial. Once I get it finished, I'll find somewhere I can post
> it for everyone.
>
> John
>
>
> On Fri, Jan 29, 2010 at 5:30 PM, Jane Foxclaw <janefoxclaw at gmail.com> wrote:
>
>> Are you running
>> OpenSim.Grid.UserServer.exe
>> OpenSim.Server.exe
>> OpenSim.Grid.MessagingServer.exe on one computer, and
>> OpenSim.exe on another computer? That would separate the two so that one
>> computer won't be overloaded. Just wondering if it's even possible?
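>>
>> As far as I understand it, that would just mean pointing the service URLs in
>> the simulator's OpenSim.ini [Network] section at the machine running the grid
>> services, something like the sketch below. The key names are from memory and
>> the addresses and ports are made up, so please correct me if they are off for
>> this version:
>>
>>     [Network]
>>         ; grid services living on the other box
>>         grid_server_url      = "http://192.168.1.10:8001"
>>         user_server_url      = "http://192.168.1.10:8002"
>>         asset_server_url     = "http://192.168.1.10:8003"
>>         inventory_server_url = "http://192.168.1.10:8004"
>>         messaging_server_url = "http://192.168.1.10:8006"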
>>
>>
>> On Fri, Jan 29, 2010 at 2:38 AM, Master_Mirage <mirage123 at verizon.net> wrote:
>>
>>>
>>> Yes, it was done while in full grid mode, with ROBUST and MySQL services.
>>> The computer is LAN-connected to the grid services and co-exists with my
>>> public grid. I don't think it matters much, as the process would still be
>>> the same. I haven't tried it in standalone mode, but I think the steps
>>> would be the same. You would need to be using a MySQL server, and not on
>>> the same box you are running the instance on. MySQL will use a lot of the
>>> instance's resources, and the instance needs every drop of computing power
>>> it can get! Also, you should not run your client on the same box as your
>>> instance, not at this size anyway, for the same reason.
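>>>
>>> To give you the idea, the only database bit in my OpenSim.ini just points
>>> at the MySQL box over the LAN, roughly like this (the host, database name
>>> and password are placeholders, and the section name may differ on older
>>> builds):
>>>
>>>     [DatabaseService]
>>>         StorageProvider  = "OpenSim.Data.MySQL.dll"
>>>         ConnectionString = "Data Source=192.168.1.10;Database=opensim;User ID=opensim;Password=secret;"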
>>>
>>> Jane Foxclaw wrote:
>>> >
>>> > For me, I am more or less curious not about your Regions.ini file, but
>>> > about your OpenSim.ini and OpenSim.Server.ini, if you are running in grid
>>> > mode. Someday I have a dream of running grid mode with Hypergrid turned
>>> > on... hahahaha
>>> >
>>> > sonya
>>> >
>>> > On Thu, Jan 28, 2010 at 11:50 PM, Master_Mirage
>>> > <mirage123 at verizon.net>wrote:
>>> >
>>> >>
>>> >> I have managed to clone 30 regions that are already developed, using
>>> >> OAR copies. I thought there might be interest in the steps I took that
>>> >> worked.
>>> >> 1. The first thing I did was lay out a 5x6 mega-region on its own
>>> >> instance. Don't combine at this point. Use fresh region UUIDs and new
>>> >> names for each. I then went and set each region's land settings and
>>> >> media settings, as well as the land textures. I chose to do a "terrain
>>> >> fill 23" as well. Note: it's best not to set auto-return at this point.
>>> >> Then I made an OAR I called blank.oar (with luck you won't ever need it).
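>>> >> Just so the layout part is clear: each of the 30 entries in Regions.ini
>>> >> gets its own fresh UUID, its own name, and its own grid coordinate
>>> >> inside the 5x6 block, roughly like this (the names, UUIDs, ports,
>>> >> coordinates and hostname here are only placeholders):
>>> >>
>>> >>     [Clone-01]
>>> >>         RegionUUID = 11111111-1111-1111-1111-111111111111
>>> >>         Location = 1000,1000
>>> >>         InternalAddress = 0.0.0.0
>>> >>         InternalPort = 9001
>>> >>         ExternalHostName = mygrid.example.com
>>> >>
>>> >>     [Clone-02]
>>> >>         RegionUUID = 22222222-2222-2222-2222-222222222222
>>> >>         Location = 1001,1000
>>> >>         InternalAddress = 0.0.0.0
>>> >>         InternalPort = 9002
>>> >>         ExternalHostName = mygrid.example.com
>>> >>
>>> >> ...and so on up to Clone-30. The blank OAR is just a console save of one
>>> >> of the freshly set-up regions:
>>> >>
>>> >>         change region Clone-01
>>> >>         save oar blank.oar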
>>> >> 2. Next was to restore the OARs I made of the regions I wanted to clone
>>> >> into the mega-region (30 in this case). Restore them slowly, making sure
>>> >> the asset server keeps up with each OAR. It's tempting to rush, but that
>>> >> is a bad idea, as I found out on my first try at this!
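>>> >> In practice that just means selecting each region on the console,
>>> >> loading its OAR, and waiting for the console to go quiet before moving
>>> >> on to the next one (the file names here are placeholders):
>>> >>
>>> >>         change region Clone-01
>>> >>         load oar region-01.oar
>>> >>
>>> >> ...then the same for Clone-02, and so on through all 30.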
>>> >> 3. Log in and check each region carefully for any problems with the OAR
>>> >> restores (ignore the ground textures; after the instance is restarted as
>>> >> combined, they will end up changing to what you set in the first step,
>>> >> not what is in the OAR). Should you see anything wrong, this is your
>>> >> last chance to fix it; simply reload the OAR again, or make a new one
>>> >> and restore it (sometimes the OAR may not be right). Once I was sure all
>>> >> was well, I shut the instance down and reloaded it normally. I did
>>> >> another quick check of the regions. Remember, this is your last chance
>>> >> to fix anything major, so take your time and look.
>>> >> 4. Shut it down again, change the region-combining setting in the ini
>>> >> (see the sketch below), and restart. It will take about five times
>>> >> longer than normal to load. You will see a lot of errors about objects
>>> >> crossing boundaries for a bit (don't panic). In my case the 30 regions
>>> >> held around 80k objects and a few thousand scripts, a mixed bag from
>>> >> simple to complex objects. Eventually the errors stop and, a bit after
>>> >> that, the loading finishes.
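>>> >> The setting I mean is the region-combining switch in the [Startup]
>>> >> section of OpenSim.ini; on my build it looks like the line below, but
>>> >> double-check the exact name against the OpenSim.ini.example for your
>>> >> version:
>>> >>
>>> >>     [Startup]
>>> >>         ; merge the contiguous regions on this instance into one mega-region
>>> >>         CombineContiguousRegions = true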
>>> >> It's OK to log in at this point and have a look around. You will see
>>> >> that some things may be missing or phantom; that may not actually be the
>>> >> case, though. A lot of things can make it look like that when really
>>> >> they are fine; the client cache, among other things, can make it seem
>>> >> that way. The phantom prims are fixed with the console command that
>>> >> fixes phantom objects. It's a slow process in my case, due to how many
>>> >> objects there are. For me it fixed all the phantom stuff as well as the
>>> >> missing prims (not sure why, but they popped into view anyway).
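>>> >> For the record, the command I mean is OpenSim's built-in phantom fixer;
>>> >> on the build I'm running it is spelled as below, but the name may differ
>>> >> between versions, so check the console's "help" output first:
>>> >>
>>> >>         fix-phantoms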
>>> >> I checked each region one by one, noting down any goofs (very minor
>>> >> ones in my case).
>>> >> I shut the instance down again, cleared my client cache, and changed
>>> >> the instance's asset caching from Cenome to Flotsam. I thought it best
>>> >> not to start out with Flotsam, knowing this would be massive.
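>>> >> Switching the cache is just a one-line change in the [Modules] section
>>> >> of OpenSim.ini, plus tuning the FlotsamCache.ini that ships with
>>> >> OpenSim. The module names below are from memory, so check them against
>>> >> your own config:
>>> >>
>>> >>     [Modules]
>>> >>         ; was CenomeMemoryAssetCache before the switch
>>> >>         AssetCaching = "FlotsamAssetCache"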
>>> >> Restarted the instance (yes, you will still see the object
>>> >> boundary-crossing errors, and it still takes a while to load it all).
>>> >> By doing it in this order, I was happy to end up with a working
>>> >> 30-region to 30-sector mega-region.
>>> >> There are a lot of tweaks that should be done in the ini, but that is
>>> >> another post entirely.
>>> >> There is a side effect to this if you still run your old regions that
>>> >> may have had parcels. They are copied, not cloned, and will revert to
>>> >> unparcelled land owned by the region owner. The clone's mega-region
>>> >> parcels will be OK, though. Why that is, I'm not sure yet.
>>> >> As this was a test, I didn't actually expect it to work at all, so I
>>> >> was happy with the result. If anyone should try this, keep in mind that
>>> >> there are a thousand things that can go wrong, and I am just saying what
>>> >> I did that gave ME a happy result. Please feel free to add to this, or
>>> >> to point out anything I may have glossed over here.
>>> >> --
>>> >> View this message in context:
>>> >> http://n2.nabble.com/Making-A-30region-clone-to-a-30-sector-Mega-region-tp4478270p4478270.html
>>> >> Sent from the opensim-users mailing list archive at Nabble.com.
>>> >
>>>
>>> --
>>> View this message in context:
>>> http://n2.nabble.com/Making-A-30region-clone-to-a-30-sector-Mega-region-tp4478270p4478700.html
>>> Sent from the opensim-users mailing list archive at Nabble.com.
>>>
>>
>>
>

