Saturday, March 13, 2010

The Virtual Reality continuum

Current futurists, scientists and philosophers consider that many ideas usually labeled as Sci-Fi will eventually happen some day, due to the exponential growth of computational power. This includes autonomous robots and virtual agents, fully immersive virtual and augmented reality, mind uploading and many others.

The virtual reality technologies used these days are not yet capable of generating fully immersive virtual environments. The immersion concept describes the degree of reality which the user experiences while perceiving and interacting with the virtual environment. The immersion effect is rather a psychological notion and is hard to measure formally. People tend to understand different things behind the term and provide inconsistent answers to questions about the degree of immersion resulting from a virtual reality interaction.
It is obvious, however, that the immersion effect depends on the technologies used to feed the virtual reality stream into the human brain.

When we talk about virtual reality, it is often considered as part of a broader concept called the virtual reality continuum, which includes different combinations of the real environment and an artificially generated one. Paul Milgram proposes a taxonomy model describing the virtual reality continuum:

[Image: Milgram's reality-virtuality continuum diagram]

The original diagram here is “augmented” by me with the relation between the Virtual Reality continuum and the mobility continuum. A major challenge in front of virtual reality technologies is the requirement for greater computational power, which implies bigger devices and higher power consumption, leading to less mobility. At the same time, the applications positioned closer to the real environment in the VR continuum require greater mobility.


Virtual reality

In general the Virtual Reality (VR) concept describes a computer-generated environment into which the agent (a person) is immersed. The agent (usually) interacts with the virtual environment to perform different tasks. There are many applications and technologies in this field already. The VR concept is used in many computer games and MMORPGs, as well as in military and education areas.

Augmented reality

The augmented reality (AR) term describes a concept where the real world is augmented by virtual objects. The virtual objects are perceived by the observer as if they were part of the real environment. The observer not only sees the virtual objects, but may also interact with them.
AR technologies may be used in engineering, education, entertainment, medicine and in any other aspect of our lives. The AR idea is not new and there are companies and universities that have been working in the field for a long time.

Total Immersion Demo on Sky News

Lego Augmented Reality


Most of the potential applications for AR solutions require the AR to happen while the observer operates (and moves) in the real environment, which requires mobile displaying technologies.
There has been a great boost in the mobile industry lately and most businesses are looking toward enriching their mobile portfolio or presence in the mobile world. The increased computational power of current mobile CPUs/GPUs (Qualcomm Snapdragon QSD8x50, Tegra) pushes the AR hype to grow very fast. Mobile devices have turned out to be a candidate AR platform capable of merging the virtual and the real and feeding it into the human brain (still using our standard vision channel).

It is interesting that according to Juniper Research the market for AR apps is estimated at $2 million currently and could grow to $732 million in 2014.

Metaio, Total Immersion, AcrossAir, Layar, Wikitude, Yelp and PresseLite, to name a few, are companies providing apps with some kind of AR functionality. Check out their web sites to see demo videos of AR apps on mobile devices.

Augmented Virtuality

The latest announcement around Microsoft’s Bing Maps Streetside view is a good example (still not very immersive :)) of Augmented Virtuality (i.e. a computer-generated environment enhanced with real-world images).

The Augmented Virtuality concept is about a computer-generated virtual environment augmented by physical objects. Physical objects are represented in the virtual environment by virtual copies. The users of the system may interact with the physical objects in real time by manipulating their virtual copies. Some authors talk about Mirror Worlds, which seems to me to be just a special case of Augmented Virtuality.

We are living in interesting times, where the computational power of computers grows at an exponential rate (according to Moore’s law), and we will experience a real mixed environment soon. Eventually, years from now, the words “virtual” and “real” will just vanish from our language, because the two will be indistinguishable to our plugged-in nervous systems.

Tuesday, October 23, 2007

Are we going to fail RL?

This is an edited version of my older post. I just added some additional thoughts here...

I’m not a scientist, but I love to speculate about different matters in the focus of science. This pseudo-scientific-emotionally-loaded-speculative essay will shed light over some behaviors demonstrated by SL residents, which may be considered irrational.
Futurists often talk about the challenges which the human race will face during the long journey towards Space colonization. They predict that these challenges will force a new biological and psychological shift in response to the extreme physical conditions in Space. However, there is yet another evolution path for us to walk – virtual world colonization. The 3D virtual worlds are getting more and more popular these days. They will eventually replace the Web 2.0 hype as a platform for social and business interactions. People will start to look on virtual worlds as a new residential place, much like the Americas in the past or Space in the future. This journey started long ago, with the advent of ARPANET.
Second Life is one of the major players in the virtual world scene. It has a number of unique features, however the most significant one is its community-driven nature. Second Life is self-organizing and changes as a result of community interactions. It is considered a place where everything may happen - the limit is our imagination. We believe that imagination is just another word for infinity, don’t we?
There are anthropological studies of the psychological limitations developed as a result of characteristics existing in our genotype and phenotype. We tend to develop specific psychological constructs, seen as a direct result of our biological characteristics. These psychological constructs are then manifested as behaviors. For the purpose of this speculation, I will talk about the biological characteristics as biological limitations. We may consider the behaviors influenced by these biological limitations as rational. We just can’t do it another way!
L'anatomie, c'est le destin!
The phrase means "anatomy is destiny" and is usually attributed to Freud. The original phrase belongs to Napoleon. We don’t jump from skyscrapers, because we can’t fly. We are afraid as the height goes up because of our flight limitations. We tend to approach a person closer to talk, because our voice is distance limited. We feel jealous when our intimate partner is approached by another individual, because our reproductive path is threatened. We can’t live in water without equipment. However, there are behaviors where the relation to specific biological properties is hard to see. Sometimes the direct relation is missing just because the biological limitation is already history. For example, adult people may demonstrate behaviors developed during infancy (when the biological limitations were present), although the biological limitation is not present anymore. Such behaviors may be considered irrational.
It is quite challenging from a scientific point of view to develop a clear behavior classification based on the rational-irrational axis in real life. However, Second Life is still far more limited in terms of possible states. It is still described by a limited number of rules, which allows researchers to operate over them with more formal tools. This makes the rational-irrational behavior classification in Second Life much easier. The “limited number of states” characteristic renders Second Life a useful scientific playground - the behavioral context is much simpler and suitable for formal representations. However, these research possibilities raise various ethical questions.
Just because the very soul of Second Life is the human imagination, we may expect to see it expressed in as many ways as there are residents. Residents build places from the past, from the future, from tales, from movies and so on. Residents build objects, write scripts, perform live, attend events or just socialize. It seems like there are tons of different expression forms in Second Life – no two are the same.
However, if one spends a little time watching residents for specific behaviors, he will discover a strange phenomenon. This phenomenon is strange if one believes that human imagination is a true synonym for “infinity”. Residents tend to demonstrate repeating irrational behavioral forms in similar situations. They are irrational in the Second Life context, because the underlying biological limitation is simply missing in the virtual world. I will try to give some examples:
• Line of Sight
Tons of public events (concerts, theaters, educational classes, presentations) are hosted every day in SL. It is not rare to see folks struggling to take a front place near the stage. In real life we’re doing this because our effective sight and hearing distance is limited. This tendency is observable even in individuals who know that SL provides instruments to watch around the avatar from a far greater distance.

• Physical Space Distribution
Go to public places in SL and watch people talking. You will see a tendency for people to distribute the virtual space based on their feelings and goals toward specific peers. People try to preserve the right distance when talking with peers. People try to stand in front of the peer as it would be done in RL. People stand aside when they feel uncertain or socially incapable. Avatars may communicate in SL (technically) without using these common social techniques, but people use them.
In RL, we tend to distribute our physical space according to our psychological space. It may be observed easily in closed spaces like rooms. We stand close to our intimate partners. We stand at a limited distance from unknown people and far from our enemies. We tend to keep a decent distance from formal peers during conversation. We take a place near the corner in unknown and unstructured situations. The distance and place are more likely to reflect our current emotional state, our fear and uncertainty, our goals. There are plenty of local social-cognitive theories which try to explain the physical space distribution as a result of specific psychological properties. We still demonstrate this type of behavior, although SL provides communication tools which are not distance limited.

• 3D Space Distribution
This one may be considered yet another Psychological Space Distribution phenomenon. In real life we live close to the ground. We have skyscrapers and towers, but they are built on the ground. Going higher and higher, the discomfort level starts rising. While in real life we may localize specific biological limitations and corresponding psychological constructs which make inhabiting the sky difficult, in SL it is slightly different. The SL world is a true 3D (still virtual) space. Avatars may fly up to 170 m on their own. The maximum height to which an avatar may fly with the help of a vehicle is about 4000 m (there are ways to push an avatar up to a million meters, actually). Buildings do not have to be built on the ground and may hang in the air up to 650-700 m from the ground. In general SL residents may live high above the ground if they want. They may fully utilize the true 3D space in the virtual world – and that is still a dream in real life in the 21st century. There are a lot of buildings and platforms in SL built high above the ground. However, the majority of the population inhabits the ground levels. I see it as a result of our specific psychological constructs formed in the physical world.

• Attribution
Attribution is a social psychology concept. It expresses the tendency of people to attach stable traits to events and people. We always try to explain the world around us. We may think of attribution as another word for explanation. When something happens, we make an attribution about it and then the attribution influences our behavior. There are various studies of the attribution phenomenon. People tend to attribute taller people as smarter, older people as wiser, men as better drivers, etc.
In RL the physically attractive individuals receive more sexual attention than individuals considered non-attractive. One possible (biologically inspired) explanation for this tendency is the assumption that physical attractiveness is evidence of biological health and eventually such an individual will “produce” healthy offspring. We may think of it as rational behavior in RL.
Residents in SL may choose their avatar's shape. They may look sexy, ugly, scary or whatever they like. Although there is no direct connection between the actual shape chosen and the psychological characteristics of the individual behind the avatar (it is not trivial to see it, at least), we tend to make attributions about the individual behind the avatar based on its virtual appearance. We automatically attribute the attractive avatars as smart, funny, trustworthy and skillful on first sight, without actually knowing the real individual behind them. We start acting towards the target avatar according to the attributions we’ve made. It’s not hard to find posts in SL forums which complain about disappointments from in-world relations. I see them as a reflection of human attributive capabilities, which are inadequate for the virtual reality characteristics.
From this perspective, being a human means having irrational psychological limitations which prevent us from being as effective as the virtual world allows. Being a human means feeling better when talking eye-to-eye with your partner and being afraid of falling down.

The technology boost in areas like Full-Immersion Virtual Reality, Augmented Virtual Reality and Biotechnologies will increase the potential and the challenges for us as virtual residents. Sensing and acting in virtual worlds will not differ from sensing and acting in RL.
Evolutionary psychology claims that the development of behavioral traits in species is based on the same evolutionary principles as the evolution of biological properties. It considers the environment as a central driving force behind the evolution process. The increased environmental challenges in the virtual space, due to future massive technology advances, will trigger the human race to evolve new behavioral traits in order to adapt to the demands of the virtual space.

The intrinsic human psychological properties, which build our identity, will be replaced. We will soon face the new human kind - virtual human, adapted to meet the virtual reality challenges.
Is the virtual human going to fail in RL trying to use its virtually evolved behaviors, then?

Wednesday, September 19, 2007

Virtual Affairs LTD and the VA Security System

I have almost set up my new place, called Virtual Affairs LTD. This is the place where I will run my business from.
The first thing (after the more important ones) to do for my place was to create a security system for it. My first goal was to place a number of doors around the office to restrict (or allow) the physical access to specific areas (customers area, staff area, owner area, etc.).
I had the following requirements at first:

1. Functional:
1.1. Having multiple doors placed around my office
1.2. Controlling all the doors remotely within my place - one by one or all at once
1.3. Delegating rights to my security officers to control the system
1.4. Controlling the access rights from a single point for the whole system
1.5. Monitoring door states and the names of passing avatars

2. Technical
- 2.1. Avoid performance penalties due to the built-in LSL function delays
- 2.2. Avoid performance penalties in large setups (many doors)
- 2.3. Extensibility of the system at specific points: custom monitoring devices, custom physical restriction devices (doors, windows, pikes, etc.)

So, I spent a few days and finally created my security system and called it "The VA Security System". As a result of the functional requirements I now have a system (which actually turns out to have more features than I originally planned) which allows me to create access rules anywhere from the avatar/door level to the everyone/every-door level. It allows me to easily delegate control rights to my staff and to control and monitor the system from any place within my office.

As we have built-in delays in LSL script functions, I had to find a strategy to overcome this problem, especially when dealing with server-door and server-monitor communications (2.1).
The good news for me was that an LSL developer may utilize a kind of asynchronous-processing strategy by using multiple scripts in a single prim and the llMessageLinked function.
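Here is a minimal sketch of that strategy (illustrative only, not the actual VA Security System code - the NOTIFY command id and the owner notification are assumptions made for this example). The first script dispatches the slow work through llMessageLinked and returns immediately:

// Script 1 - the dispatcher: stays responsive, no delayed calls in here
integer NOTIFY = 100;    // arbitrary command id understood by the worker script

default
{
    touch_start(integer n)
    {
        // hand the slow work over to the worker script and return immediately
        llMessageLinked(LINK_THIS, NOTIFY, llDetectedName(0), llDetectedKey(0));
    }
}

The worker script, dropped into the same prim, absorbs the built-in delay of the slow call:

// Script 2 - the worker
integer NOTIFY = 100;

default
{
    link_message(integer sender, integer num, string msg, key id)
    {
        if (num == NOTIFY)
        {
            // llInstantMessage sleeps *this* script for about 2 seconds,
            // while the dispatcher keeps handling its own events
            llInstantMessage(llGetOwner(), msg + " passed through the door.");
        }
    }
}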
The other technical problem (2.2) was to reduce lag when dealing with a large number of connected doors and HUDs. This was more about considering the overall architecture of the system. I had a functional requirement to have a single access control point - i.e. one place to configure who may and who may not have access. My first thought was that this requirement forces me to utilize a highly centralized architecture, where the core layer (the server) would be the bottleneck of the system from a performance point of view. Fortunately, I was able to take another path and use a more distributed architecture. The system components are slightly more complicated, however the server component is far simpler and faster.
The interesting part here is that the system is still highly centralized from a functional point of view (i.e. the user's point of view). It is still controlled from a single place, but the core functionality is actually distributed under the hood, between the system components.
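To illustrate the distributed idea, here is a minimal sketch of a door script that caches the access list locally and takes the access decision itself, so there is no round trip to the server on every passage. The channel number and the "ALLOW,<avatar name>" message format are assumptions made up for this example; they are not the actual VA Security System protocol.

// Door script - caches the access list and decides locally
integer CONTROL_CHANNEL = -987123;   // illustrative private channel
list allowed = [];                   // locally cached access list

default
{
    state_entry()
    {
        // listen for access-list updates broadcast by the central server
        llListen(CONTROL_CHANNEL, "", NULL_KEY, "");
    }

    listen(integer channel, string name, key id, string msg)
    {
        list parts = llParseString2List(msg, [","], []);
        if (llList2String(parts, 0) == "ALLOW")
        {
            allowed += [llList2String(parts, 1)];
        }
    }

    collision_start(integer n)
    {
        // the access decision is taken locally - no round trip to the server
        if (llListFindList(allowed, [llDetectedName(0)]) == -1)
        {
            llSay(0, "Access denied for " + llDetectedName(0));
        }
    }
}

With this split the server only has to broadcast list updates (for example with llRegionSay on the same channel), which keeps the central component simple and fast.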

I installed it at my place and conducted some tests for a few days. Then I decided to release it commercially. It is published on slexchange.

It is the first release of the system and I'm more focused on creating a robust and extensible core for the system. I plan to add more features as well in order to get tighter control over my place.
I provide installation assistance and on-going support for the system as well.
Want to test the system? There is a live demo in my office.

Monday, September 17, 2007

SL Scripter/Builder: Random thoughts on LSL and Scripting Quality Attributes

I've been digging into the LSL jungle lately, and a number of things came to my attention. I will summarize some of them:

  • LSL scripts are finite state machines in general. It's very intuitive to map the states from the problem domain to states in LSL. Let's say we have to develop a door which has to open when an avatar is detected and then close automatically. From the problem domain (a door) we know that the door may be open or closed, i.e. we have two states: OPEN, CLOSED. In LSL this will look like the following:

    // example sensor parameters (adjust to suit your build)
    float SCAN_RANGE = 5.0;
    float SCAN_RATE = 1.0;

    default
    {
        state_entry()
        {
            state closed;
        }
    }

    state open
    {
        state_entry()
        {
            // allow avatars to get through the door
            llSetStatus(STATUS_PHANTOM, TRUE);
            // make it mostly transparent
            llSetAlpha(0.2, ALL_SIDES);
            // hook to the sensor event
            llSensorRepeat("", NULL_KEY, AGENT, SCAN_RANGE, PI, SCAN_RATE);
        }

        sensor(integer num_detected)
        {
            // avatars are still nearby - keep the door open
        }

        no_sensor()
        {
            // close the door if nothing is detected
            state closed;
        }
    }

    state closed
    {
        state_entry()
        {
            // block avatars from getting through the door
            llSetStatus(STATUS_PHANTOM, FALSE);
            // make it fully visible
            llSetAlpha(1.0, ALL_SIDES);
            // hook to the sensor event
            llSensorRepeat("", NULL_KEY, AGENT, SCAN_RANGE, PI, SCAN_RATE);
        }

        sensor(integer num_detected)
        {
            // open the door by using a state transition
            state open;
        }
    }

    Note that event hooks (llListen, llSensor, llSetTimerEvent, etc.) are only valid within the state in which they were started. State transitions make event hooks unavailable. If you want the door to respond to chat commands, you should invoke llListen and implement the listen() event in every state. However, you may place the listen logic in a global routine and invoke it from every state's listen event in order to avoid duplicated code sections, as sketched below.
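    A skeleton of that pattern might look like this (a sketch: the "status" command and the owner-only filter are illustrative, and the transition into state open is omitted for brevity):

    // shared chat-command logic lives in one global routine
    handleCommand(string msg)
    {
        if (msg == "status")
        {
            llOwnerSay("The door script is alive.");
        }
    }

    default
    {
        state_entry()
        {
            // the hook must be re-created in every state...
            llListen(0, "", llGetOwner(), "");
        }

        listen(integer channel, string name, key id, string msg)
        {
            // ...but the logic is written only once
            handleCommand(msg);
        }
    }

    state open
    {
        state_entry()
        {
            llListen(0, "", llGetOwner(), "");
        }

        listen(integer channel, string name, key id, string msg)
        {
            handleCommand(msg);
        }
    }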

  • Note that llSensor and llSensorRepeat support the no_sensor event. It may be used to trigger processing when nothing is detected. The sensor event is raised only when something is sensed. In other words, the following snippet is useless:


    sensor(integer num_detected)
    {
        if (num_detected > 0)
        {
            // something is sensed, process it here
        }
        else
        {
            // script flow will never reach here
        }
    }

    Instead, use the following one:

    ...
    llSensorRepeat("", NULL_KEY, AGENT, SCAN_RANGE, PI, SCAN_RATE);
    ...

    sensor(integer num_detected)
    {
        // num_detected is always greater than 0, i.e. something is detected
    }

    no_sensor()
    {
        // nothing was detected within the last SCAN_RATE seconds
    }
  • Facilitating communication between multiple scripts in a single prim should be done via llMessageLinked. Routines like llSay and llRegionSay do not work in this case, since a prim cannot hear its own chat.
I've looked into a lot of scripts published in the Script Library, freebies and commercial objects lately. I found it strange that most of them lack coding practices which are considered a must in the software industry these days.
I'm aware that LSL has its own limitations and that taking these practices to the max will increase the memory and performance pressure at runtime. However, this is not an argument to fully forget about these concepts. We have to find the balance between maintainability and usability.
I just don't understand how a quality in-world product and customer service with a longer life cycle may be provided when the underlying code base lacks basic quality attributes.

Friday, September 14, 2007

The virtual super human is near

Different scientific areas these days provide predictions about the future evolution of the human race. I actually don't believe (like many others) there will be a major human race evolution shift in terms of natural biological change.
I do believe in human race evolution; however, it will be an evolution due to massive technological advances in biotechnologies, nanotechnology and the ICT sector. These advances will bring us many "goodies", mainly through body implants, nanobots and increased computational power.
The trend of increasing computational power has been a fact for 50 years now (Moore's law): computational (and communication) power roughly doubles every two years. There is evidence that many hard scientific and technological problems will be solved when a specific computational power is achieved. Such tasks are human brain reverse engineering and simulation, artificial intelligence, DNA sampling, and quantum-level reverse engineering and simulations.
The super human - the result of the next evolution shift - will eventually be a product of artificial biological and technological changes. What might these look like? Some of the candidate technologies are nanobots, body implants and artificial organs - both artificially grown biological organs and purely artificial ones. We all know at least one movie or book where they talk about cyborgs. This is a reality already.

  • Communication on a digital level
    The communication with external devices may flow on a digital level. Our brain implants or nanobots will encode neural activities into a digital protocol suitable for communication with external computer-based devices. Maybe encoding/decoding will be needed for legacy device communication. Future devices may have a standard neural communication adapter, much like the many USB-based devices today. They will be able to directly encode/decode our neural signals.

  • Improved brain characteristics: memory capacity, information processing speed
    The latest advances in brain reverse engineering and the knowledge of neural cell chemistry will produce brain implants that enhance our memory capacity. These memory brain implants will increase our ability to learn and recall stored information, and even increase our speed of information processing. Have you ever watched the Johnny Mnemonic movie?

  • Augmented Virtual Reality
    With the help of implants and nanobots we will see the environment as a combination of virtual and real world data. We won't need a PDA to see our calendar. We won't need a screen to see a city map or GPS information. Very similar to the way we operate in Second Life now. We may recall the map of the world. We see the name of the person above his head.

  • Full-Immersion Virtual Reality
    Virtual Reality will not be virtual anymore. Virtual world input (visual, auditory, sensory) will be fed directly into the corresponding neural circuits. Nanobots and implants will suppress the senses from the real world and replace them with the virtual input. They will block the neural commands to our muscles and feed them back to the virtual world, where they will be interpreted as virtual actions. Want to go back to the real world? Just turn the nanobots off. Say SL to plug in. Say RL to plug off.

  • True empathy
    What about feeling the same as others? With the help of neural signal encoding/decoding we may be able to know each other better.


  • Improved senses and new sensor modalities
    We're limited to sensing the world through sensors for hearing (20 Hz-20 kHz range), vision (350-700 nm range), touch, gravity, chemicals, etc. Nanobots and implants may give us improved hearing: longer distance, increased frequency range (ultrasound); improved vision: longer distance, increased frequency range (radio waves); and new sensor modalities like magnetic fields, for example.

  • Better health and longer life
    Nanobots deployed in our blood circulation will monitor our vital parameters 24x7 and send data through a wireless connection to a central computer system for analysis. They could help our immune system by detecting and killing viruses, cancer cells and bacteria.

  • Artificial Intelligence
    All of the topics described above relate in one way or another to the ancient dream of the human race to create a thinking machine. Many believe that this is a matter of computational power only. There are a lot of philosophical and scientific debates about whether the artificial intelligence idea is plausible; however, I believe that the singularity is near.
The possibilities are unlimited...


Modifying our bodies this way will generally change the way we sense the world. This will force us to create new behavioral traits, which eventually will be the only product of natural human evolution.


Links
Prof. Kevin Warwick, the first cyborg
Ray Kurzweil - my favourite futurist
KurzweilAI - web site dedicated to AI and related concepts
Mechanical Singularity
Evolutionary psychology
Nanobots
Technological Singularity
Augmented Virtual Reality

Sunday, September 9, 2007

Virtual Worlds and Human Imagination Boundaries

I’m not a scientist, but I love to speculate about different matters which are usually in the focus of science. So if you feel like reading a pseudo-scientific-emotionally-loaded-speculative essay, then go on. If you’re a true scientist and believe in the strict formal approach, the following text is not for you. Virtual Worlds (VW) are getting more and more popular these days. While there is not a broad acceptance of the virtual worlds as a communication platform yet, they will definitely be a part of our social life one day. These tools will eventually replace the Web 2.0 hype as a new content representation paradigm. People will then look on virtual worlds as an important social and business environment, or even as new residential places, much like the Americas or Space. We will need to develop new skills in order to squeeze the most out of this challenging environment. Futurists talk about the evolution of the human race in response to space colonization. I think we’ll have yet another evolution path to optimize – virtual world colonization. This path started in the past, with the advent of ARPANET. I’m talking about an evolution path, because the virtual world properties will challenge our current biological limitations and their psychological reflection.
There are anthropological theories focused on the psychological limitations developed as a result of characteristics existing in our genotype and phenotype. We tend to develop specific psychological and cognitive constructs, seen as a direct result of our individual biological characteristics. People demonstrate behaviors which are mapped to specific psychological constructs. Some of these behaviors may be easily seen as a direct result of specific biological characteristics. For the purpose of this speculation I will talk about them as biological limitations. Behaviors influenced by these biological limitations may be considered rational. We just can’t do it another way.

L'anatomie, c'est le destin!


The phrase means "anatomy is destiny" and is usually attributed to Freud. However, Freud himself said that this phrase belongs to Napoleon. We don’t jump from skyscrapers, because we can’t fly. We are afraid as the height goes up because of our flight limitations. We tend to approach a person closer to talk, because our voice is distance limited. We feel jealous when our intimate partner is approached by another individual, because our reproductive path is threatened. However, there are behaviors influenced by psychological constructs where the relation to specific biological properties is hard to see. Sometimes the direct relation is missing just because the biological limitation is already history. There are behaviors developed during infancy (when the biological limitations were present) which are used later in the individual’s life, when the limitation is not present anymore. Such behaviors may be seen as irrational. It is quite challenging from a scientific point of view to develop a clear behavior classification based on the rational-irrational axis. The virtual worlds these days are still far more limited in terms of possible states (compared to the real world). The virtual worlds are still described by a limited number of rules, which allows researchers to operate over them with more formal tools. The formalization is possible due to the fact that they are computer simulated. This renders the virtual worlds a useful scientific playground - the behavioral context is much simpler and suitable for formal representations. However, these research possibilities raise various ethical questions.
Because of the ideas mentioned above, I see the virtual world presence as a big evolution challenge for the human race. If one is roaming in the Second Life world, he may observe various behaviors which may be mapped to psychological constructs, while the biological limitation is not present in the virtual world. In real life we act this way just because we are biologically limited, but in the virtual world these behaviors may be classified as irrational. I will try to give some examples:

  • Line of Sight
    Tons of public events (concerts, theaters, educational classes) are hosted every day in SL. It is not rare to see folks struggling to take a front place near the stage. In real life we’re doing this because our effective sight and hearing distance is limited. This tendency may be observed even in individuals who know that SL provides instruments to watch around the avatar from a far greater distance.
  • Psychological Space Distribution
    In real life, we tend to distribute our physical space according to our psychological space. It may be observed easily in closed spaces like rooms. We stand close to our intimate partners. We stand at a limited distance from unknown people and far from our enemies. We tend to keep a decent distance from formal peers during conversation. We take a place near the corner in unknown and unstructured situations. The distance and place are more likely to reflect our current emotional state, our fear and uncertainty, our goals. There are plenty of local social-cognitive theories which try to explain the physical space distribution as a result of specific psychological properties. Second Life provides a few communication tools which are distance limited and one which is not (IM). Go to public places in SL and watch people talking. You will see a tendency for people to distribute the virtual space based on their feelings and goals toward specific peers. People try to preserve the right distance when talking with peers. People try to stand in front of the peer as it would be done in the real world. People stand aside when they feel uncertain or socially incapable. Avatars may communicate in SL (technically) without using these common social techniques, but people use them.
  • 3D Space Distribution
    In real life we live close to the ground. We have skyscrapers and towers, but they are built on the ground. Going higher and higher, the discomfort level starts rising. While in real life we may localize specific biological limitations and corresponding psychological constructs which make inhabiting the sky difficult, in SL it is slightly different. The SL world is a true 3D (still virtual) space. Avatars (individuals) may fly up to 170 m (not sure about the exact height, actually) on their own. The maximum height to which an avatar may fly with the help of a vehicle is about 4000 m (there are ways to push an avatar up to a million meters, actually). Buildings do not have to be built on the ground and may hang in the air up to 650 m from the ground. In general SL residents may live high above the ground if they wish. They may fully utilize the true 3D space in the virtual world – and that is still a dream in real life in the 21st century. There are a lot of buildings and platforms in SL built high above the ground. However, the majority of the population inhabits the ground levels. I see it as a result of our specific psychological constructs formed in the physical world.

We may find plenty of examples about the way people tend to demonstrate behaviors used in the real world, which may be seen as rational in the physical world and irrational in the virtual world.

I’m not advocating that people should not behave like that in the virtual world. We are humans when we behave like humans. We have to stay close to our intimate partner. We have to stand in front of our conversation peer. We have to feel joy when observing beautiful pictures. We may live up high and feel fear. We must hunt for emotions. We must live for the moments when the emotion stops our breath. However, the virtual worlds will change us. We will be limited only by our imagination.

This is the time to test if human imagination has boundaries!

Saturday, September 8, 2007

SL Scripter/Builder: How to rotate an object to a target direction, when the object has STATUS_PHYSICS

If you're going to be a scripter in SL, sooner or later you will crash into a bunch of problems when trying to rotate objects/prims.
A good start for every scripter in the jungle of rotations, quaternions, vectors and other 3D space related manipulations are the following resources:
http://wiki.secondlife.com/wiki/Rotation
http://www.cprogramming.com/tutorial/3d/quaternions.html
http://www.euclideanspace.com/maths/index.htm


I had a problem like this lately. I'm working on a new product called Star Glider - a kind of stylish flying vehicle - and I needed to implement a feature called auto-landing.
During the auto-land sequence the vehicle needs to rotate itself (around the Y axis) and become parallel to the ground. The idea behind this is to avoid landing on the vehicle's nose or tail when the pilot approaches the ground.



So, here comes the problem - How to rotate an object to a target direction.
In the problem given above, I will consider (for simplicity) that the ground rotation is equal to zero. Looking at the image:

  • V1 represents the target direction - a unit vector along the Y axis, <0,1,0>, i.e. a vector with ZERO angle relative to the Y axis (note the image is not exact from a mathematical point of view)
  • V2 represents the current direction of the object in the Y-Z plane

What we need is to calculate the angle between the two vectors, generate a rotation quaternion based on that angle and apply the quaternion to the object.
The following script may be used:

// current global rotation of the object (the global rotation of the root prim)
rotation currentRot = llGetRot();
// the object's current left (local Y) direction in world coordinates - V2 on the image
vector v1 = llRot2Left(currentRot);
// the target direction - the world Y axis, V1 on the image
vector v2 = <0.0, 1.0, 0.0>;
// rotation that takes v1 onto v2 (the rotation between the two vectors)
rotation delta = llRotBetween(v1, v2);
// apply it on top of the current rotation; the physics engine rotates the prim
llRotLookAt(currentRot * delta, 1.0, 1.0);

If you have a situation like in the image, the object will rotate until it becomes parallel with the ground - i.e. until the angle becomes zero.


NOTE:
Instead of llRotLookAt one may try to use llSetRot. However, llSetRot does not work on physical prims (STATUS_PHYSICS).
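A small hypothetical helper (not part of the original script) can make that choice explicit by checking the prim's physics status at run time:

// pick the right rotation call depending on whether the prim is physical
applyRotation(rotation target)
{
    if (llGetStatus(STATUS_PHYSICS))
    {
        // physical prims ignore llSetRot, so use the physics-driven call
        llRotLookAt(target, 1.0, 1.0);
    }
    else
    {
        // non-physical prims can simply be rotated in place
        llSetRot(target);
    }
}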

Links:
http://wiki.secondlife.com/wiki/Rotation
http://www.cprogramming.com/tutorial/3d/quaternions.html
http://www.euclideanspace.com/maths/index.htm