
Open Chat

Comfy-Girl

The cutest plushie bunny in a lil green sweater
me:
All the memes are for lazy introverted boys with no confidence. I'm the exact opposite.

friend:
We have K-pop.

me:
BUT I'M STREET!
 

Hound-of-chulainn

Well-Known Member
I gave my cat a bath and she knocked my headset in the tub. :(
But my headset stopped me from getting a claw in my shoulder so all is well. xD
BUT SHE CLEAN AND NOW MAYBE MY ALLERGIES WILL LESSEN FOR A WEEK. :D
 

Fallowfox

Are we moomin, or are we dancer?
I am being super dumb today, but can anybody help me?

I want to have two variables, x and y, and a dependent variable z.

I want the combination of x and y to be predictive of z, but neither to be predictive of z on their own.

Should I just create a random vector of numbers, call that z, then take another random vector of numbers (call that x) and subtract it to arrive at a residual, and call that residual y?


Edit: this is not the way to do it; y is related to z very clearly.
 

contemplationistwolf

Aspirational AI Engineer Wolf
I am being super dumb today, but can anybody help me?

I want to have two variables, x and y, and a dependent variable z.

I want the combination of x and y to be predictive of z, but neither to be predictive of z on their own.

Should I just create a random vector of numbers, call that z, then take another random vector of numbers (call that x) and subtract it to arrive at a residual, and call that residual y?


Edit: this is not the way to do it; y is related to z very clearly.
We can pick both X and Y to be independent and take on integer values from 0 to N-1 with equal probability (1/N), with N being an arbitrarily chosen positive integer. Then we can choose Z = (X + Y) Mod N.
Here Mod is the remainder after division by N.

With that choice, no matter what value X takes on, Z can still take on any value from 0 to N-1 equiprobably. The same applies for Y while the other is unknown. With both X and Y known, though, Z is uniquely determined.
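If it helps, here is a rough sketch of that construction in base R (n and N are arbitrary choices):

n <- 10000                                   # number of draws
N <- 10                                      # modulus
x <- sample(0:(N - 1), n, replace = TRUE)    # X uniform on 0..N-1
y <- sample(0:(N - 1), n, replace = TRUE)    # Y uniform, independent of X
z <- (x + y) %% N                            # Z = (X + Y) Mod N

cor(x, z)         # near 0: X alone tells you nothing about Z
cor(y, z)         # near 0: same for Y alone
table(x, z)[1, ]  # with X fixed, Z is still roughly uniform over 0..N-1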
 

Fallowfox

Are we moomin, or are we dancer?
I am not sure whether I should respond to contemplationistwolf's attempt to help, because I appear to be on his block list.

I solved the problem in an inelegant way yesterday.

I defined a randomly distributed vector z
z = normal distribution (mean=0, standard deviation=1)
I defined a randomly distributed vector x
x = normal distribution (mean=0, standard deviation=1)
I then defined a suppressor variable as the difference between the two, with added noise
y=z-x + normal distribution (mean=0, standard deviation=1)
and then I added additional noise as I saw fit
y= y + normal distribution (mean=0, standard deviation=1)

and I re-ran the code until a random set was generated for which z does not depend on x or y, but *does* depend on their combination.
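(For the record, the above as runnable R, with rnorm standing in for the normal draws and the vector length n chosen arbitrarily:)

n <- 100
z <- rnorm(n, mean = 0, sd = 1)              # dependent vector
x <- rnorm(n, mean = 0, sd = 1)              # first predictor
y <- z - x + rnorm(n, mean = 0, sd = 1)      # suppressor: difference plus noise
y <- y + rnorm(n, mean = 0, sd = 1)          # additional noise

cor(z, x)                          # near 0 by construction
cor(z, y)                          # usually not near 0, hence the re-running until a draw looked right
summary(lm(z ~ x + y))$r.squared   # the combination explains more of z than either alone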
 

contemplationistwolf

Aspirational AI Engineer Wolf
I am not sure whether I should respond to contemplationistwolf's attempt to help, because I appear to be on his block list.

I solved the problem in an inelegant way yesterday.

I defined a randomly distributed vector z
z = normal distribution (mean=0, standard deviation=1)
I defined a randomly distributed vector x
x = normal distribution (mean=0, standard deviation=1)
I then defined a suppressor variable as the difference between the two, with added noise
y=z-x + normal distribution (mean=0, standard deviation=1)
and then I added additional noise as I saw fit
y= y + normal distribution (mean=0, standard deviation=1)

and I re-ran the code until a random set was generated for which z does not depend on x or y, but *does* depend on their combination.
You are not on my block/ignore list, I don't have anyone on that list anymore, and block lists were done away with anyways. These days we just have one-sided ignore lists, where one party just won't see what the other posts, while the other party sees everything the first posts and can interact with them normally. The forum doesn't in any way notify if you are on a person's ignore list.

Given our unfortunate history of spats, I'm probably on your ignore list. You however are perfectly free to respond to me if you wish, though if you wish me to not address you at all then we can do that.
 

Fallowfox

Are we moomin, or are we dancer?
You are not on my block/ignore list, I don't have anyone on that list anymore, and block lists were done away with anyways. These days we just have one-sided ignore lists, where one party just won't see what the other posts, while the other party sees everything the first posts and can interact with them normally. The forum doesn't in any way notify if you are on a person's ignore list.

Given our unfortunate history of spats, I'm probably on your ignore list. You however are perfectly free to respond to me if you wish, though if you wish me to not address you at all then we can do that.

There was a forum update a while ago, which meant that block-lists became 'symmetric'.
So you must have been added to my block-list automatically at some point in the past without me realising.

Anyway, I have produced the vectors with the properties I wanted, but I am now wondering whether I actually needed them for the idea I was trying to test.
 

contemplationistwolf

Aspirational AI Engineer Wolf
There was a forum update a while ago, which meant that block-lists became 'symmetric'.
So you must have been added to my block-list automatically at some point in the past without me realising.

Anyway, I have produced the vectors with the properties I wanted, but I am now wondering whether I actually needed them for the idea I was trying to test.
Well, glad you figured out a suitable solution. Hey, why not share what idea you were trying to test, if it's not confidential or anything? From the looks of things, you do work on interesting things.
 

Fallowfox

Are we moomin, or are we dancer?
Well, glad you figured out a suitable solution. Hey, why not share what idea you were trying to test, if it's not confidential or anything? From the looks of things, you do work on interesting things.


Let's say I have a matrix of data about the animals I have observed in a wood.
I also have a matrix of data about the weather conditions at the time I observed the animals, and the types of plants I saw.
I have a hypothesis that the occurrence of different animals might depend on both weather conditions and the types of plants available.

So I was contriving a scenario where some pretend properties of the weather and plants didn't individually explain animal occurrence, but combined to explain it well.
That way I could produce a statistical workflow, applicable to real datasets, that could detect these sorts of features.
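(To give a flavour of the pattern I mean, here's a contrived R sketch; the variable names and the product form are purely illustrative placeholders, not a claim about how real data behave:)

n <- 500
weather <- rnorm(n)                      # pretend weather property
plants  <- rnorm(n)                      # pretend plant property
animals <- weather * plants + rnorm(n)   # occurrence driven only by the combination

summary(lm(animals ~ weather))$r.squared            # ~0: weather alone explains nothing
summary(lm(animals ~ plants))$r.squared             # ~0: plants alone explains nothing
summary(lm(animals ~ weather * plants))$r.squared   # much higher once the interaction term is included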
 

Fallowfox

Are we moomin, or are we dancer?
Let's say I have a matrix of data about the animals I have observed in a wood.
I also have a matrix of data about the weather conditions at the time I observed the animals, and the types of plants I saw.
I have a hypothesis that the occurrence of different animals might depend on both weather conditions and the types of plants available.

So I was contriving a scenario where some pretend properties of the weather and plants didn't individually explain animal occurrence, but combined to explain it well.
That way I could produce a statistical workflow, applicable to real datasets, that could detect these sorts of features.

It's not really looking like there is a nice straightforward answer to this.
 

contemplationistwolf

Aspirational AI Engineer Wolf
Let's say I have a matrix of data about the animals I have observed in a wood.
I also have a matrix of data about the weather conditions at the time I observed the animals, and the types of plants I saw.
I have a hypothesis that the occurrence of different animals might depend on both weather conditions and the types of plants available.

So I was contriving a scenario where some pretend properties of the weather and plants didn't individually explain animal occurrence, but combined to explain it well.
That way I could produce a statistical workflow, applicable to real datasets, that could detect these sorts of features.
Sounds not too dissimilar from the kind of project I myself had to do not too long ago. Basically, I had to take robot sensor data and the computations from our neural networks to construct a feature that helps us detect specific rare cases our robots ran into. The feature had to allow a high-recall, high-precision classifier to be constructed while also being fast to compute on a large quantity of historic robot data.

Anyhow, wish you luck figuring out the most appropriate workflow for that task! From what I gather of the task, I'd personally approach it by taking appropriate subsamples of the real datasets, like specific forest(s) and specific timespans, and then just experiment with various features that intuitively seem like they should be predictive, specific functions of plant presence and weather conditions. Then, I'd see how well the constructed features correlate with the occurrence of said animals, and I'd keep improving those features based on the results I see.
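As a very rough R sketch of that loop (the data frame, column names and candidate features are entirely made up for illustration, and the toy data here is pure noise; with real data you'd keep whichever features actually stand out):

obs <- data.frame(temperature = rnorm(200, mean = 15, sd = 5),   # hypothetical weather column
                  rainfall    = runif(200, min = 0, max = 30),   # hypothetical weather column
                  fern_cover  = runif(200, min = 0, max = 1),    # hypothetical plant column
                  deer_seen   = rpois(200, lambda = 2))          # hypothetical occurrence count

# candidate features: ad-hoc functions of the weather and plant variables
features <- list(warm_and_ferny = obs$temperature * obs$fern_cover,
                 wet_or_ferny   = pmax(obs$rainfall / 30, obs$fern_cover))

sapply(features, function(f) cor(f, obs$deer_seen))   # keep and refine whichever candidates correlate best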
 

Judge Spear

Well-Known Member
[attached image: 20220112_174850.jpg]
 

Fallowfox

Are we moomin, or are we dancer?
Sounds not too dissimilar from the kind of project I myself had to do not too long ago. Basically, I had to take robot sensor data and the computations from our neural networks to construct a feature that helps us detect specific rare cases our robots ran into. The feature had to allow a high-recall, high-precision classifier to be constructed while also being fast to compute on a large quantity of historic robot data.

Anyhow, wish you luck figuring out the most appropriate workflow for that task! From what I gather of the task, I'd personally approach it by taking appropriate subsamples of the real datasets, like specific forest(s) and specific timespans, and then just experiment with various features that intuitively seem like they should be predictive, specific functions of plant presence and weather conditions. Then, I'd see how well the constructed features correlate with the occurrence of said animals, and I'd keep improving those features based on the results I see.

Fortunately the inferential methods I am using are all linear statistics. Neural network approaches are beyond me.
 

contemplationistwolf

Aspirational AI Engineer Wolf
Fortunately the inferential methods I am using are all linear statistics. Neural network approaches are beyond me.
Thankfully I didn't have to build them on my own. I just used the results of what the ones in our robots computed. I'm not an expert in them by any means, not my precise specialty, though I have studied them a bit.

I solved the problem in an inelegant way yesterday.

I defined a randomly distributed vector z
z = normal distribution (mean=0, standard deviation=1)
I defined a randomly distributed vector x
x = normal distribution (mean=0, standard deviation=1)
I then defined a suppressor variable as the difference between the two, with added noise
y=z-x + normal distribution (mean=0, standard deviation=1)
and then I added additional noise as I saw fit
y= y + normal distribution (mean=0, standard deviation=1)

and I re-ran the code until a random set was generated for which z does not depend on x or y, but *does* depend on their combination.
It's probably not gonna be useful anymore as you already generated the sets you wanted, but I think here's how you would get exactly what you wanted:

x = uniform distribution (min = 0.0, max = 1.0)
y = uniform distribution (min = 0.0, max = 1.0)
z = fmod(x+y, divisor = 1.0)

Here fmod is the floating point remainder by division function. I assume R is the language you are using, so here is its reference: https://www.rdocumentation.org/packages/RPMG/versions/2.2-3/topics/fmod
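In base R the same thing can also be done with the %% operator, which works on doubles too; a quick sketch with an arbitrary n:

n <- 10000
x <- runif(n, min = 0, max = 1)
y <- runif(n, min = 0, max = 1)
z <- (x + y) %% 1         # fractional part of x + y

cor(x, z)                 # near 0: x alone tells you essentially nothing about z
cor(y, z)                 # near 0: likewise for y; together, though, they determine z exactly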
 

Fallowfox

Are we moomin, or are we dancer?
Thankfully I didn't have to build them on my own. I just used the results of what the ones in our robots computed. I'm not an expert in them by any means, not my precise specialty, though I have studied them a bit.


It's probably not gonna be useful anymore as you already generated the sets you wanted, but I think here's how you would get exactly what you wanted:

x = uniform distribution (min = 0.0, max = 1.0)
y = uniform distribution (min = 0.0, max = 1.0)
z = fmod(x+y, divisor = 1.0)

Here fmod is the floating point remainder by division function. I assume R is the language you are using, so here is its reference: https://www.rdocumentation.org/packages/RPMG/versions/2.2-3/topics/fmod

Thank you for your advice. You are right that R is my favourite language.

I have decided to use a matrix of real data decomposed into 'mutually orthogonal' components as the 'dependent' matrix, so that it contains real complexities such as structured non-uniform distributions,
and I have then produced synthetic independent matrices, each structured by a different independent component of the dependent matrix.

So I know that multiple independent matrices should better constrain variance in the dependent matrix than any single independent matrix.
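(Roughly, in R, with a simulated matrix standing in for the real data just to show the shape of the workflow; prcomp does the orthogonal decomposition, and the single 'target' vector is a simplification of the full dependent matrix:)

set.seed(1)
dep <- matrix(rnorm(200 * 5), nrow = 200, ncol = 5)   # stand-in for the real 'dependent' matrix
pc  <- prcomp(dep, center = TRUE, scale. = TRUE)      # mutually orthogonal components

ind1 <- pc$x[, 1] + rnorm(200, sd = 0.5)   # synthetic predictor structured by component 1
ind2 <- pc$x[, 2] + rnorm(200, sd = 0.5)   # synthetic predictor structured by component 2

target <- pc$x[, 1] + pc$x[, 2]               # dependent variation spread across both components
summary(lm(target ~ ind1))$r.squared          # a single predictor constrains only part of it
summary(lm(target ~ ind1 + ind2))$r.squared   # both together constrain more of the variance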
 