Narrow Band Then Wide Band Tuning - Dynojet Power Vision

Started by Sporty 48, October 27, 2011, 08:28:40 AM


whittlebeast

#100
I wish it were just 3%; I could work with that number.  About three percent is what I have seen on the BTs.

At highway speeds, the data logs on the Sportys look very different  than they do on the BT motors.  Feel free to take each for a ride and then plot pulse width.

Beast

Dynos are great for getting the motor close enough to get on the data loggers.

glens

Quote from: hpulven on November 01, 2011, 06:40:32 AM
Quote from: glens on November 01, 2011, 04:53:06 AM
If 90% of the hits are on the lean swing, all that means is that 90% of the hits that were captured were on the lean swing.  It doesn't mean that any more than half the swings the engine uses were lean.

This is a point that interests me;
On my Twin Cam the PV logs show another interesting tendency: when the CLBs are at 700 mV, the sensor voltages average (count or integral) about 840 mV, and it looks like the sensors are doing crossovers about once every five seconds.
With CLB at around 500, it looks like the sensors are doing crossovers about once every second (like they are supposed to do). Until spring and further testing I can only speculate around several possibilities:
-Are the NB sensors that much slower at a higher CLB?
-Is the ECM data output the reason? Why is there a systematic skew one way or the other?
-Is the data collecting on the PV the problem? Again, why is there a systematic skew?

What do people using TTS see when they log sensor voltage at higher CLBs?  What is the mean time between crossovers compared to at about 450 mV?

Okay, look at it this way.  At 3000 RPM the crank turns 50 times per second.  A four-stroke cycle takes two revolutions, so halving that number gets you the frequency of the exhaust strokes, during which, say, the O2 sensor is polled: 25 times per second at 3000 RPM.  Inverting 25 cycles per second gets you 0.04 seconds per cycle, so the O2 sensor is getting read every 40 milliseconds at 3000 RPM.  I believe that's about the average delta between successive "readings" taken by the Powervision: 40 ms.  The TTS averages roughly 250 ms per "reading", depending upon the data "package" requested.  That's about as fast as the ECM can deliver the data over the bus.  The PV keeps replicating the last data point every ~40 ms until a new value for that data point arrives from the bike's data port, typically every ~250 ms, and then replicates the new value.

So at 3000 RPM you're seeing in your data log, on average, 40/250 of the O2 sensor readings the ECM uses.  16% of the pertinent data is all you're seeing at 3000 RPM.  What do you suppose the chances are that you're getting a "perfect" look at what's actually going on?  If the ECM fuels higher-than-CLB for a couple or three power strokes, then lower-than-CLB for a couple or three power strokes, what are the chances you're seeing a proportionate representation of that in the data logs?  I don't know.  I'm asking you what you think the chances are.
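A minimal sketch of the sample-and-hold arithmetic glens describes, assuming the figures quoted above (an O2 reading inside the ECM every ~40 ms at 3000 RPM, a fresh value over the bus every ~250 ms); all numbers are illustrative, not measured from any particular bike:

```python
# Sample-and-hold logging: the logger repeats its last value every ~40 ms
# but only captures a genuinely new ECM reading every ~250 ms (assumed).
ECM_PERIOD = 0.040   # s between O2 readings inside the ECM (assumed)
BUS_PERIOD = 0.250   # s between fresh values reaching the logger (assumed)
DURATION   = 60.0    # s of simulated riding

n_readings = int(DURATION / ECM_PERIOD)   # O2 readings the ECM actually uses
seen = {int(k * BUS_PERIOD / ECM_PERIOD)  # index of the reading caught at each bus update
        for k in range(int(DURATION / BUS_PERIOD))}

print(f"ECM readings: {n_readings}, distinct readings in the log: {len(seen)}")
print(f"fraction visible: {len(seen) / n_readings:.1%}")  # ~16%, i.e. 40/250
```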

hpulven

Quote from: glens on November 01, 2011, 02:30:27 PM
So at 3000 RPM you're seeing in your data log, on average, 40/250 of the O2 sensor readings the ECM uses.  16% of the pertinent data is all you're seeing at 3000 RPM.  What do you suppose the chances are that you're getting a "perfect" look at what's actually going on?  If the ECM fuels higher-than-CLB for a couple or three power strokes, then lower-than-CLB for a couple or three power strokes, what are the chances you're seeing a proportionate representation of that in the data logs?  I don't know.  I'm asking you what you think the chances are.
Thank you for your answer, these are indeed numbers to keep in mind trying to interpret logs.
To answer your question, I naturally think the chances of getting a "perfect" look are about zero. What I would expect is that we would get far fewer crossovers in the logs than are happening in reality, as the reality is indeed undersampled. Further, I would expect runs where higher voltages are overrepresented and runs where lower voltages are overrepresented.
But in a long log run we are sampling from a reality where the voltages are supposed to be under the chosen bias about 50% of the time and over it about 50% of the time, so from a statistical point of view the global average should be near the chosen bias voltage. So I would in fact expect a proportional representation of each side of the bias in the log as a whole. This is what puzzles me and the reason I asked my questions: I really don't understand the systematic skew of the data; the somewhat peculiar habit of the PV of repeating the last value is not enough to explain it.
As I am not able to test this with my newly bought TTS until my bike is out of winter storage, I am curious to know whether the TTS way of logging gives a more proportional representation of the O2 sensor voltage. That would be helpful in eliminating some possibilities.
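For anyone wanting to put a number on "mean time between crossovers" when comparing TTS and PV logs, a sketch along these lines would do it; the CSV layout and the column names (time, o2_mv) are assumptions to adjust for whatever the export actually contains:

```python
# Estimate mean time between O2 crossovers from a log export.
# Column names "time" (s) and "o2_mv" (mV) are hypothetical.
import csv

def mean_crossover_interval(path, bias_mv=500.0):
    """Average seconds between crossings of the bias voltage."""
    times, volts = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            times.append(float(row["time"]))
            volts.append(float(row["o2_mv"]))
    cross_times = [times[i] for i in range(1, len(volts))
                   if (volts[i - 1] - bias_mv) * (volts[i] - bias_mv) < 0]
    if len(cross_times) < 2:
        return None   # not enough crossings to estimate
    return (cross_times[-1] - cross_times[0]) / (len(cross_times) - 1)

# e.g.: mean_crossover_interval("ride_log.csv", bias_mv=700.0)
```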

TXP

Without lab quality sensors, you'll never get there. Perfect is as elusive as a pot of gold at the end of the rainbow. From my experience, there is currently no more accurate product out there than TTS for the Delphi systems. Like it or not, Steve is still leading the pack IMHO. Others are working to make their products more user friendly. I recently had a very good discussion with the SESPT programming guru and he is not static. I expect we will see that product move in a positive direction in the not too distant future. In this business you continually move forward or get left in the dust.

glens

I don't think we can expect anything even approaching a proportionate view of what the O2 sensors are saying.  We might get mostly the high swings one time, mostly the low swings another.  All we really can expect is to catch the fact that the sensors are indeed switching due to the injector activity.  Just like all we can expect is to catch the fact that the injectors are fueling some low and some high.  To take the data we have and attempt to derive exactly how much time the injectors are spending open vs. how much time they're spending closed, or how much time the O2 sensors are spending above CLB vs. below CLB, is just plain foolish.

In direct answer to your question, I've seen O2 traces which usually indicate a shorter and broader plateau above CLB, with a deeper and narrower valley below CLB at such times as the CLB is set toward the higher end.  When CLB is closer to the midpoint the traces are more symmetrical.  But all I ever take away from any of it is that they are indeed switching.

If you held a steady engine speed and throttle position (if those are the axes of your VE tables; RPM/MAP otherwise) long enough, and at the center of some VE table cell, then you likely could trust the "VE new" and/or "AFF" that you see.  I'm assuming the Powervision "AFF" is derived by "VE new"/"VE".  But even if you could trust it, I still don't think it would be informative in any usable way regarding repopulating your VE tables.  You would only be able to do that if you knew where the AFV cells were located in your calibration, and what the surrounding AFV cells contained.  Since they're averaged together proportionately, they might not really indicate what the best VE value would be to place in any given VE cell.

Another aspect of this would be that you'd have to know what the center of a VE cell really pertains to.  If there's a cell at, say, 2500 RPM vs. 50% throttle, is the center on the RPM axis at 2500, or does that value pertain to the cell boundary nearer the smaller RPM value?  Likewise, does the 50% throttle indicate the center of that column or the boundary adjoining the next smaller value?  Do you understand what I'm saying?  It'd make a difference, for sure.

As well, you'd have to decide if the physical center of the cell would be the best place to find a value for that cell, or whether the best statistical sample of all that the cell covers would be the best value to use; the two are likely different.  I don't feel there's enough data available out of the system bus to determine such stuff by merely perusing log files.  You'd need intimate knowledge that only a few folks possess in order to accurately put the data to use.

The best thing to do would be to grab all the sensor and component signals directly, at full speed, and interleave that data with what you can discover from the ECM, or compare it directly to the calibration tables.  Only then could you use the information to derive a better state of tune.

whittlebeast

If you flip a coin 10000 times you will get real close to 5000 heads or 50%

If you only count every tenth flip you will count 1000 flips and get real close to 500 heads or 50%

As long as the number of samples is high, nearly nothing changes in the big picture.

Beast
Dynos are great for getting the motor close enough to get on the data loggers.
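Beast's claim is easy to check in a few lines; a minimal sketch, assuming the flips are independent (which is exactly the assumption challenged further down the thread):

```python
# Subsampling independent coin flips preserves the long-run average.
import random

flips = [random.randint(0, 1) for _ in range(10_000)]
every_tenth = flips[::10]   # keep only every 10th flip

print(f"all flips:  {sum(flips) / len(flips):.3f} heads")
print(f"every 10th: {sum(every_tenth) / len(every_tenth):.3f} heads")
# Both land near 0.500 -- but only because each flip is independent.
# A periodic signal sampled periodically behaves very differently.
```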

glens

True enough if you're tossing coins.  What's the likelihood of getting several hundred of something in a row? 

Maybe we should refer to this as "The Coin Toss" method of tuning.

mayor

I actually agree with Andy's logic on this; it's an SPC (statistical process control) way of viewing things. This was along the lines of what I was thinking in relation to the slower broadbands. You might not be able to get a reading every fire, but there are still plenty of fires being recorded. 
warning, this poster suffers from bizarre delusions

Coyote

#108
Tossing coins is not a good analogy for tuning. I've spent my life in the controls, closed loop, digital sampling industry. Once you lose data, you cannot recover it from averaging. There is a reason that you must sample much faster than the data itself changes. A nice theorem to look up is the Nyquist theorem; it explains the minimum sampling rate for analog signals. And while I'm the first to admit I don't know crap about tuning, this is just basic engineering. Andy's logic has no basis here (certainly not on a moving target). If you hold steady for some time, then maybe.
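Coyote's point is easy to illustrate; a minimal sketch with made-up numbers (an idealized NB sensor switching at 1 Hz, a "ground truth" channel at 1000 samples/s, and a slow log channel at 1.1 samples/s, far too slow to follow the switching):

```python
# Undersampling a switching signal: the slow record still "switches",
# but at a false, aliased frequency. All rates are illustrative.
import math

SWITCH_HZ = 1.0   # assumed true sensor switching frequency

def o2(t):
    # idealized NB sensor: square wave between ~0.1 V and ~0.8 V
    return 0.45 + 0.35 * math.copysign(1.0, math.sin(2 * math.pi * SWITCH_HZ * t))

def crossovers(samples, bias=0.45):
    # count sign changes around the bias voltage
    return sum(1 for a, b in zip(samples, samples[1:])
               if (a - bias) * (b - bias) < 0)

duration = 60.0
truth  = [o2(i / 1000.0) for i in range(int(duration * 1000))]
logged = [o2(i / 1.1)    for i in range(int(duration * 1.1))]

print(f"crossovers, ground truth: {crossovers(truth)}")   # ~120
print(f"crossovers, slow log:     {crossovers(logged)}")  # ~12, aliased
```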

Scotty

Just something I read on REAL random events.

Random doesn’t look so random!
Most people find long series of consecutive results much less likely than they really are. A string of four heads starts looking like a pattern when in fact, there isn't one. A string of eight or nine will have a crowd looking for any cause other than random chance but in reality, there is a very good chance that at least one of these long runs of consecutive results will appear if you continue flipping a coin long enough. In fact, you are almost guaranteed to see at least one series of six or more consecutive heads or tails in one hundred coin flips. This is why fake results are so easy to spot. We find this very surprising at first because we expect the outcome of consecutive coin tosses to alternate between heads and tails much more than they really do. We easily mistake long series of consecutive results, or clusters, as patterns when they are in fact truly random. To the untrained eye, this makes random results appear to be 'fake', and fake results seem more 'real'.
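The "almost guaranteed" claim is simple to verify by simulation; a minimal sketch:

```python
# How often do 100 coin flips contain a run of 6+ identical results?
import random

def longest_run(flips):
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

trials = 10_000
hits = sum(longest_run([random.randint(0, 1) for _ in range(100)]) >= 6
           for _ in range(trials))
print(f"P(run of 6+ in 100 flips) ~ {hits / trials:.2f}")   # ~0.8
```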

TXP

When you are reading data from the ECM, and it's data that is actually NOT being updated as you read it, and the target is moving, how can anyone possibly think you are getting perfect data, or a perfect tune? Very few of us in the field will ever even see the type of equipment it takes to capture the data in question in real time. So basically, aren't we all just using what we have and doing the best we can with it? Chasing perfect data without the correct "lab quality" equipment just isn't realistic. Not throwing stones at anyone, but the laws of physics have not been repealed... at least as far as I know.  :hyst:

whittlebeast

Most of my early tuning on the Harleys is done in closed loop anyway, using the CLI, O2 Integrator, or VE New vs VE.  All of those are filtered by the "sacred" ECU anyway.  When you look at several rides of 20000 to 30000 samples, patterns start jumping out at you.  Even 6 apparently random hits in a row normally would not get my attention.  When you see 6 high, then 6 low, then 5 high, then 7 low... that pattern would get my attention.  If you saw that pattern every time, then you have something.

Beast
Dynos are great for getting the motor close enough to get on the data loggers.

yositime

#112
Whether you see a pattern or not, the observation is simply an artifact of the sampling method and/or timing.  You probably need a good book and a 10-week undergraduate course to gain a better technical understanding, but it is what it is. We see this often from the younger fellas in radio or satellite communication practice when they try to use improper sampling techniques or try to draw conclusions from the bastardized data (extracting data from what appears to be random samples). I'm sure it happens anywhere you try to predict outcomes based on insufficient data.  It is not that the observation is useless, but the analysis usually gives you the wrong result, which more often than not leads you down a wrong path, making unsupported claims, along with misleading you into believing you are doing things more precisely than you are.

Happens all the time, and sometimes a gem is discovered by accident... but the chances of that are less than hitting the Powerball jackpot :).

If you are trying to hit a moving target where your control system only has the capability to home in to +/- 3% of ideal, and it really doesn't matter if you miss the target by 6%, it seems to me that trying to adjust the target to within 0.1% of ideal may be an academic exercise... but that's just my opinion, and what do I know. 

glens

Andy, we know how the CLB is derived when the same value is used throughout.  Tell us how any VE, VE new, or O2 integrator we might see in a log are derived.  Then tell us how they should be applied and why.  Don't be timid.  Give us something to chew on.

1FSTRK

Quote from: whittlebeast on November 02, 2011, 03:55:21 AM
Most of my early tuning on the Harleys is done in closed loop anyway, using the CLI, O2 Integrator, or VE New vs VE.  All of those are filtered by the "sacred" ECU anyway.  When you look at several rides of 20000 to 30000 samples, patterns start jumping out at you.  Even 6 apparently random hits in a row normally would not get my attention.  When you see 6 high, then 6 low, then 5 high, then 7 low... that pattern would get my attention.  If you saw that pattern every time, then you have something.

Beast

I have read this entire thread and I too see the pattern
:gob: :gob: :missed: :missed: :missed: :banghead: :banghead: :gob: :gob: :missed: :missed: :missed: :banghead: :banghead:
"Never hang on to a mistake just because you spent time or money making it."

hpulven

Some interesting input here:
Quote from: glens
To take the data we have and attempt to derive exactly how much time the injectors are spending open vs. how much time they're spending closed, or how much time the O2 sensors are spending above CLB vs. below CLB, is just plain foolish.
Yes, trying to be exact is foolish, but trying to find an explanation for why these patterns are skewed is not. It is not foolish to ask why they are skewed if you want to understand what is going on.
Quote from: glens
In direct answer to your question, I've seen O2 traces which usually indicate a shorter and broader plateau above CLB, with a deeper and narrower valley below CLB at such times as the CLB is set toward the higher end.  When CLB is closer to the midpoint the traces are more symmetrical.  But all I ever take away from any of it is that they are indeed switching.
Thanks, that is what I see as well, so then it is a question of why the ECM is putting out skewed data. You are satisfied with seeing that they switch, and I probably should be as well, but it would be interesting to know why.
(So far no one has answered this question ...)

Quote from: Coyote
Tossing coins is not a good analogy for tuning. I've spent my life in the controls, closed loop, digital sampling industry. Once you lose data, you cannot recover it from averaging. There is a reason that you must sample much faster than the data itself changes. A nice theorem to look up is the Nyquist theorem; it explains the minimum sampling rate for analog signals.

I agree in the context of tuning, or of trying to reconstruct the analog signal, but no one is trying to do that, are they? I fail to understand what Nyquist has to do with it, unless you are trying to reconstruct the analog signal and play it on your stereo equipment. There are lots of other theorems much more relevant in this context, taken from statistical sampling of discrete data, not from the signal theory of analog signals. What this is about is finding and filtering representative data from an undersampled dataset. (Which is what the tuning software does.)
My question is just about a peculiarity in a simple test for randomness:
Why are the data skewed? As they are skewed, we can agree that we don't have a random sampling situation. That is what I am wondering about: what is the explanation for this nonrandom effect?

Quote from: TXP
So basically aren't we all just using what we have and doing the best we can with it. Chasing perfect data without the correct "lab quality" equipment just isn't realistic. Not throwing stones at anyone, but the laws of physics have not been repealed,,,at least as far as I know.
Exactly, so let us make sure we are not repealing statistical laws as well...
Even if the CL operation is not a random process, I would have thought that one hour of riding around would be enough to give a quite good pseudorandom situation. Sampling a random variable defined as over or under the CLB bias, from a dataset where we know that this variable should have a nearly 50/50 distribution, would indeed be a coin-tossing model. If this model is wrong, there are several possibilities:
-The sampling is not random, but then what is this nonrandom effect?
-The ECM is busier when the sensors show lean than when they show rich? If so, why?
-The sampling intervals are for some reason just the right multiple of the period of the O2 voltage signal; that would explain it, but it seems totally unrealistic.

I am still puzzled in my, perhaps foolish, hunt for some knowledge of a phenomenon which might be totally uninteresting and totally unusable for tuning, but still...

whittlebeast

Hpulven

I would love to have you on board with the stuff I am seeing in the Yamaha world.  The data is amazing and totally different from what has been assumed to be gospel for years.  Things like: the MAP at a 20% TPS sweep may be 94 kPa, and at a 100% TPS sweep it may be 96 kPa.  You would expect the acceleration, based on this, to be about the same.  But with the 100% sweep, the bike is accelerating 60% harder.  Is there a TPS-based timing table?  Who knows...  But the lesson is: keep looking at the data, there is a wealth of info in there.  You just have to be willing to look.

Beast 
Dynos are great for getting the motor close enough to get on the data loggers.

Sporty 48

Looking at a datalog reminds me of a scattergraph of coin tosses once the engine speed and manifold air pressure of a motorcycle ride are added to the mix. This is not about lost data; it is about the data that we get. When a change is introduced to the system, say adjusting VEs, the resulting change is reviewed in the data we get from the next datalog.
Very simple experimentation. Make a change, look at the results.
This is a simple infernal combustion engine. Gas and air in, fire, exhaust out, do it again.
Find ways to make it faster, better. Keep it simple.


Quote from: Coyote on November 01, 2011, 08:18:09 PM
Tossing coins is not a good analogy for tuning. I've spent my life in the controls, closed loop, digital sampling industry. Once you lose data, you cannot recover it from averaging. There is a reason that you must sample much faster than the data itself changes. A nice theorem to look up is the Nyquist theorem; it explains the minimum sampling rate for analog signals. And while I'm the first to admit I don't know crap about tuning, this is just basic engineering. Andy's logic has no basis here (certainly not on a moving target). If you hold steady for some time, then maybe.
A Sportster, Bird-dogs and an old Airstream, How Sweet It Is.

PC_Hater

Quote from: Sporty 48 on November 02, 2011, 09:45:43 AM
Looking at a datalog reminds me of a scattergraph of coin tosses once the engine speed and manifold air pressure of a motorcycle ride are added to the mix. This is not about lost data; it is about the data that we get. When a change is introduced to the system, say adjusting VEs, the resulting change is reviewed in the data we get from the next datalog.
Very simple experimentation. Make a change, look at the results.
This is a simple infernal combustion engine. Gas and air in, fire, exhaust out, do it again.
Find ways to make it faster, better. Keep it simple.


Quote from: Coyote on November 01, 2011, 08:18:09 PM
Tossing coins is not a good analogy for tuning. I've spent my life in the controls, closed loop, digital sampling industry. Once you lose data, you cannot recover it from averaging. There is a reason that you must sample much faster than the data itself changes. A nice theorem to look up is the Nyquist theorem; it explains the minimum sampling rate for analog signals. And while I'm the first to admit I don't know crap about tuning, this is just basic engineering. Andy's logic has no basis here (certainly not on a moving target). If you hold steady for some time, then maybe.

BUT IT IS NOT SIMPLE!!!
YOU ARE WORKING WITH CENSORED DATA. THAT JUST MAKES TUNING EVEN MORE DIFFICULT.
The caps were deliberate, as in: remember the basic engineering. Well said, Coyote, but there are ways around that; they just give non-PhDs a headache... And have a look at Bayesian statistics while we are at it.
PC_Hater BSc MSc
1942 WLA45 chop, 1999 FLTR(not I), 2000 1200S

whittlebeast

Sometimes it's just not that difficult to pick up on the pattern.

Dynos are great for getting the motor close enough to get on the data loggers.

glens

Andy, answer the freakin' questions!  How is the AFF derived?  How is what is used to derive the AFF derived?  How is the VE it's working against derived?  What is the O2 integrator and how is it derived and used?

Unless and until you can give real answers to those questions, then using them for anything whatsoever is foolish.  Your time would be very much better served just riding the damn bike.  Do you ever even do that?

whittlebeast

Try asking Steve how the VE New was developed.

I have noticed that if you take

VE Front * (Front AFF/100) * (Front CLI/100) you get real close to VE New Front

In PowerVision you can turn on and off AFF and CLI

How would I know....  Try calling Delphi and see if they will give you the underlying math.  I just try to make sense of the data as it comes out of the port.

Beast
Dynos are great for getting the motor close enough to get on the data loggers.
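Written out as code, the relationship Beast describes looks like this; note it is an approximation observed in log data, not a documented Delphi formula:

```python
# Observed approximation from PV logs: VE New ~ VE * AFF% * CLI%.
def ve_new_front(ve_front, aff_front, cli_front):
    """Approximate 'VE New Front' from logged VE, AFF and CLI (AFF/CLI in %)."""
    return ve_front * (aff_front / 100.0) * (cli_front / 100.0)

# e.g. a logged VE of 85 with AFF at 103% and CLI at 98%:
print(ve_new_front(85.0, 103.0, 98.0))   # ~85.8
```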

FLTRI

Quote from: whittlebeast on November 02, 2011, 11:03:56 AM
Sometimes it's just not that difficult to pick up on the pattern.


Andy,
Please take a couple of moments to point out the good, the bad, and the ugly on your graphs like Mayor does, so we can follow along.
You know, for us idiots who can't seem to see the obvious :scratch:
Thanks,
Bob
The best we've experienced is the best we know
Always keep eyes and mind open

glens

Quote from: hpulven on November 02, 2011, 04:57:32 AM
Yes, trying to be exact is foolish, but trying to find an explanation for why these patterns are skewed is not. It is not foolish to ask why they are skewed if you want to understand what is going on.
What's to understand about it?  It's not a situation similar to a coin toss, where there can only be one of two results.  Whenever you see a plotted point above the CLB, you do not know whether it was captured on the way up, at the peak, or on the way down.  The resolution just isn't there.  If you were able to ask the ECM to show only the minimum and maximum voltages of the O2 sensor, nothing at all of the rises and falls, then and only then might you be able to liken it to a coin toss.  But I'd bet it still wouldn't approach a 50/50 result over time like the coin toss eventually would.

If you could sample the data rapidly enough, you wouldn't need to see the specific high and low points; you could reconstruct them if they'd been omitted.  But like I said, you don't know which direction the voltage is travelling at any time you get a snapshot of it.  It's effectively way beyond random.

Quote from: hpulven
... so then it is a question of why the ECM is putting out skewed data. You are satisfied with seeing that they switch, I probably should be as well, but it would be interesting to know why.
(So far no one has answered this question ...)

I'm afraid that to get that answer you're probably going to have to get hold of either a Delphi or an H-D engineer.  Someone who's actually worked on the code and is willing to discuss it with you.  It could have been done deliberately if the EPA mandated designed-in anti-tampering measures.  It could just be the result of "we never intended such use of the data, so didn't spend the time working on providing it in any meaningful way".  The antiquated bus in use may be a large factor.

Quote from: hpulven
If this model [50/50 distribution during CL] is wrong, there are several possibilities:
-The sampling is not random, but then what is this nonrandom effect?
-The ECM is busier when the sensors show lean than when they show rich? If so, why?
-The sampling intervals are for some reason just the right multiple of the period of the O2 voltage signal; that would explain it, but it seems totally unrealistic.

Unlike you, I feel that last possibility there is indeed the most likely of the bunch.  You've got a non-fixed frequency being sampled at a non-fixed frequency.  Remember, sending run data out the bus is not the primary duty of the ECM.  It's a relatively simple device built for one high-priority purpose.
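A minimal sketch of that last possibility, with made-up numbers: if the sampling interval locks onto a multiple of the sensor's switching period, every sample lands on the same part of the swing, and no amount of riding averages the skew away:

```python
# Periodic sampling locked to a multiple of the signal period.
PERIOD    = 1.0   # s, assumed switching period (~1 crossover per second)
SAMPLE_DT = 2.0   # s, exactly two periods -- the pathological lock

def o2_volts(t):
    # idealized 50/50 square wave: 0.8 V half the period, 0.2 V the rest
    return 0.8 if (t % PERIOD) < PERIOD / 2 else 0.2

samples = [o2_volts(i * SAMPLE_DT) for i in range(1800)]   # ~1 hour of logging
print("true time-average of the signal: 0.50 V")
print(f"average of the logged samples:   {sum(samples) / len(samples):.2f} V")
# Prints 0.80 V. A near-lock (SAMPLE_DT = 1.99, say) doesn't read as 50/50
# either: it stretches the mismatch into very long runs on one side of the
# bias, which shorter logs then show as a systematic skew.
```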

Quote from: hpulven
I am still puzzled in my, perhaps foolish, hunt for some knowledge of a phenomenon which might be totally uninteresting and totally unusable for tuning, but still...

I don't feel it's foolish of you to seek the answer to your question.  I feel it's foolish to try to fine-tune the running characteristics not knowing everything the ECM knows.  And for whatever reason, it's obviously not going to share all that with you.

whittlebeast

FLTRI

In that plot, apparently the AFF, or Adaptive Fuel Factor, was turned on.  In that screenshot I had the colors set to auto-scale, so keep in mind that the color scheme is different on the two plots.  Notice that above about 2800 RPM both plots are showing 100 for the AFF, so apparently I was in open loop above that RPM at the time.  The other colors indicate what the average AFF was at each throttle position.  So in the plot on the left, below about 5% throttle and below about 2100 RPM, the ECU was adding about 5% fuel through most of that area.

In the rear cylinder it was adding about 2%, over a fairly narrow band of RPM near 2000 and mostly below 5% throttle.  The rear cylinder was getting really close on that bike.

Hope this helps

Beast
Dynos are great for getting the motor close enough to get on the data loggers.