I’m being stupid - you mentioned pairs(), so you’ve obviously got a keyed table rather than a numerically indexed array, which means iterating backwards was never going to be relevant.
When you mentioned next in the original post, I presume that’s where you’re now using k,v in pairs()?
Yep - I guess pairs() uses next() to do its iterating.
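For what it’s worth, in Lua 5.1 pairs(t) literally returns next, t, nil, so the two loops below (over a made-up table t) perform the same traversal:

local t = { a = 1, b = 2, c = 3 }

for k, v in pairs(t) do
    print(k, v)
end

for k, v in next, t do  -- same traversal, calling next directly
    print(k, v)
end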
Is your timerStack indexed by number or by key?
By key.
Actually, what I’m doing is creating random unique alphanumeric keys and pairs()-ing through them, because I’m not really sure how to use indexing.
I think most people would use an integer index and traditional for i = 1, tablesize do / end constructs to manage those types of loops. Key-value pairs are good when you have a natural index that’s not a number, but it sounds like for what you’re doing the index doesn’t matter.
You can add things to the end of a table pretty easily using the # operator (which gets the table length):
mytable[#mytable+1] = value
Then to remove an entry, you can use table.remove… for instance to remove the 3rd entry: table.remove(mytable, 3).
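Putting those two together, a quick sketch (the table name and values are just placeholders):

local mytable = {}
mytable[#mytable + 1] = "Fred"   -- #mytable is 0 here, so this fills mytable[1]
mytable[#mytable + 1] = "Barney"
mytable[#mytable + 1] = "Wilma"

table.remove(mytable, 2)         -- removes "Barney"; "Wilma" slides down to index 2
print(mytable[2], #mytable)      --> Wilma   2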
And to iterate over the table (best for removing things is to go backwards):
for i = #mytable, 1, -1 do
    -- do whatever
end
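The reason backwards matters: table.remove(t, i) shifts every entry above i down one slot, so a forward loop would skip the element that just slid into position i. Going backwards, the shifted entries have already been visited. A small sketch (the even-number test is just an example condition):

local mytable = { 1, 2, 3, 4, 5, 6 }

for i = #mytable, 1, -1 do
    if mytable[i] % 2 == 0 then    -- say we want to drop the even values
        table.remove(mytable, i)   -- safe: only already-visited slots shift
    end
end
-- mytable is now { 1, 3, 5 }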
And indexing like that won’t have any *issues* (caused by the indexing, not the coder)?
The only catch is deleting things from the middle of the list while using the # operator to measure the length of the array. If you have an array:
x[1] = 10
x[2] = 15
x[3] = nil
x[4] = "Barney"
x[5] = "Wilma"
Then #x returns 2. The nil stops the counting (strictly speaking, the result of # is undefined when an array has holes). There are two solutions. One is to use table.maxn(x) to get the highest numeric index (5 here) and, when iterating over it, skip the nils. The other is table.remove(x, 3) to remove the entry. The latter will copy 4 to 3 and 5 to 4 to collapse out the hole, so it can be a bit time consuming if you have 100,000 items and you remove entry 4, for instance.
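To make the two options concrete, here’s a sketch (with the same caveat that # on a table with holes is unreliable):

-- Option 1: leave the hole, iterate to the highest index, skip the nils
for i = 1, table.maxn(x) do
    if x[i] ~= nil then
        print(i, x[i])
    end
end

-- Option 2: collapse the hole as you delete, so no hole ever exists
local y = { 10, 15, 20, "Barney", "Wilma" }
table.remove(y, 3)  -- "Barney" slides to 3, "Wilma" to 4
print(#y)           --> 4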
Of course, if you use table.maxn() and iterate over a list of 100,000 entries where 90% of them have been removed, that can be inefficient too. So use a combination of the two: iterate with maxn(), then periodically purge the nils and get a new maxn().
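A rough sketch of that hybrid approach (delete, purge, and deadCount are names I’ve made up, and the purge threshold is arbitrary):

local list, deadCount = {}, 0

-- delete by nil-ing the slot: O(1), but it leaves a hole
local function delete(i)
    if list[i] ~= nil then
        list[i] = nil
        deadCount = deadCount + 1
    end
end

-- periodically compact: copy the survivors into a fresh, hole-free table
local function purge()
    local compacted = {}
    for i = 1, table.maxn(list) do
        if list[i] ~= nil then
            compacted[#compacted + 1] = list[i]
        end
    end
    list, deadCount = compacted, 0
end

-- e.g. purge once more than half the slots are dead:
-- if deadCount > table.maxn(list) / 2 then purge() end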