Objects array - display.remove() ----- INTERESTING BEHAVIOUR

Let’s say we have an array of three elements:

local c = {}
for i=1, 3 do
    c[i] = display.newCircle( 0, 0, 1)
end

-- then, I remove the 2nd element:
display.remove( c[2] )
c[2] = nil

-- Now I print the number of elements
print ("There are "..#c.." elements")

The result in the console is 1. Why?
It should still be 3, since the removed element was in an intermediate position, leaving a hole in the array (table).

Hi. This is unrelated to display.remove(), but only to your table.

In Lua, the # operator (see the docs) only guarantees that it will give you one of the possible positions that come before a nil. It works reliably as long as you don't poke holes in the middle of the table, but otherwise it may behave as you see above.
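If you need a reliable count when a table may contain holes, one option (my own sketch, not something from this thread) is to iterate with pairs, which only visits keys that actually hold values:

```lua
-- Count every non-nil entry, regardless of holes.
-- pairs() skips keys whose value is nil, so holes
-- don't affect the result.
local function count( t )
    local n = 0
    for _ in pairs( t ) do
        n = n + 1
    end
    return n
end

local c = { 1, nil, 3 }
print( count( c ) )  --> 2 (whereas #c is unreliable here)
```

This is O(n) over all keys rather than the (usually faster) # operator, but it is well defined no matter how the table was built.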

The length operator starts iterating from the start of the table and returns how many entries it finds before the first nil value.

Edit: seems that @StarCrunch was a few seconds faster. :smile:

Thank you @StarCrunch … I need to study a little more :grinning_face_with_smiling_eyes:

Thank you @XeduR, a worthwhile clarification.

Can anyone explain this:

local c = {}
for i=1, 3 do
    c[i] = i
end
c[2] = nil
print ("There are "..#c.." elements") --> result: 1
local a = {1,2,3}
a[2] = nil
print ("There are "..#a.." elements") --> result: 3

What on earth is the difference between those two examples?
:exploding_head:

You can read something like this and other similar Q&As for details since Solar isn’t unique when it comes to Lua and undefined behavior.

Without going into the details, there is this explanation from the 5.1 manual that StarCrunch linked above: it really is “any of the indices that…”, so the result isn’t determined by the actual differences between your examples:

The length of a table t is defined to be any integer index n such that t[n] is not nil and t[n+1] is nil; moreover, if t[1] is nil, n can be zero. For a regular array, with non-nil values from 1 to a given n, its length is exactly that n, the index of its last value. If the array has “holes” (that is, nil values between other non-nil values), then #t can be any of the indices that directly precedes a nil value (that is, it may consider any such nil value as the end of the array).

In my example, the initial tables c and a are identical ({1,2,3}). Then I nillify their second element, so they both should be {1, nil, 3}. However, #c=1 while #a=3. Am I missing something?

Maybe the second bizarre thing in Lua? (The first being that array indices start at 1, not 0.)

Per Lua 5.1 Manual

The length of a table t is defined to be any integer index n such that t[n] is not nil and t[n+1] is nil; moreover, if t[1] is nil, n can be zero. For a regular array, with non-nil values from 1 to a given n, its length is exactly that n, the index of its last value. If the array has “holes” (that is, nil values between other non-nil values), then #t can be any of the indices that directly precedes a nil value (that is, it may consider any such nil value as the end of the array).

Notice the “can be”, which means not necessarily. :thinking:

And from the Lua 5.2 manual (I don’t think how # works changed between 5.1 and 5.2; it just explains it in a different (simpler?) way):

Unless a __len metamethod is given, the length of a table t is only defined if the table is a sequence, that is, the set of its positive numeric keys is equal to {1…n} for some non-negative integer n. In that case, n is its length. Note that a table like

{10, 20, nil, 40}

is not a sequence, because it has the key 4 but does not have the key 3. (So, there is no n such that the set {1…n} is equal to the set of positive numeric keys of that table.) Note, however, that non-numeric keys do not interfere with whether a table is a sequence.
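The 5.2 definition can be checked mechanically. Here’s a small helper (my own sketch, not something from the manual or Solar) that tests whether a table’s positive integer keys form exactly {1…n}:

```lua
-- Returns true if the positive integer keys of t are exactly 1..n
-- for some non-negative n, i.e. t is a "sequence" in the Lua 5.2 sense.
-- Non-numeric keys are ignored, matching the manual's note.
local function isSequence( t )
    local n = 0
    for k in pairs( t ) do
        if type( k ) == "number" and k >= 1 and k % 1 == 0 then
            n = n + 1
        end
    end
    for i = 1, n do
        if t[i] == nil then
            return false
        end
    end
    return true
end

print( isSequence( { 10, 20, 30 } ) )       --> true
print( isSequence( { 10, 20, nil, 40 } ) )  --> false (key 4 but no key 3)
```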

Short answer:

When a table is a sequence, i.e. its set of positive numeric keys is exactly {1…n} (or a __len metamethod is given), the # operator returns that n. Performing a length check on a table that is not a sequence results in an “unpredictable” value… meaning the result could be accurate or not, and I read somewhere else that the returned value for such tables can even be 0, -1, or false (not sure which Lua version, though).

More insight:

In the examples you provided, if you instead remove index 2 using

table.remove(a, 2)

the data stays sequenced, because table.remove shifts the later entries down so that no nil hole is left; this shifting is the “overhead” we sometimes want to avoid by NOT using table.remove… but it does come in handy.
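To make the shifting concrete, here is a quick sketch of what table.remove does to the indices, compared with nilling the slot directly:

```lua
local a = { 10, 20, 30 }
table.remove( a, 2 )  -- removes a[2] and shifts a[3] down into a[2]

print( a[1], a[2], a[3] )  --> 10  30  nil
print( #a )                --> 2 (still a sequence, so # is well defined)

-- Compare with nilling the slot directly, which leaves a hole:
local b = { 10, 20, 30 }
b[2] = nil  -- b is now { 10, nil, 30 }: not a sequence, so #b is unreliable
```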

For the low-level details, go down to OP_LEN in luaV_execute, in particular the table case. You can click on luaH_getn to see the actual algorithm.

I could be remembering this wrong, but it has to do with how the tables are initialised.

local c = {}
for i=1, 3 do
    c[i] = i
end
c[2] = nil
print ("There are "..#c.." elements") --> result: 1

local a = {1,2,3}
a[2] = nil
print ("There are "..#a.." elements") --> result: 3
a[4] = 4
a[4] = nil
print ("There is now "..#a.." element") --> result: 1

When you initialise a table with a predetermined size, it performs slightly better than a table that starts out empty, because its array part is preallocated and the table is expected to keep that size (hence why # also returns 3). It should hold on to that until the table expands for the first time after its initialisation. After that initial expansion, the table size is no longer expected to remain the same, and the length operator works as expected.

To follow up, I did a quick and simple test and benchmarked these two functions:

local function a()
    local t = {true, true, true, true}
    t[1] = 1
    t[2] = 2
    t[3] = 3
    t[4] = 4
end

local function b()
    local t = {}
    t[1] = 1
    t[2] = 2
    t[3] = 3
    t[4] = 4
end

And a tends to be around twice as fast. There should also be a memory-related performance boost. Still, this is a purely negligible performance improvement in 99.99% of instances: running 100 000 iterations of a took around ~30ms, whereas b took around ~60ms. The absolute difference isn’t much.

I think Lua tables could be considered something like weak tuples upon initialisation (if they only have numeric indices).
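For reference, a timing loop along these lines (my own sketch using os.clock; the thread doesn’t show the actual harness) can reproduce this kind of comparison:

```lua
-- Simple micro-benchmark helper: runs f `iters` times
-- and returns the elapsed CPU time in milliseconds.
local function bench( f, iters )
    local start = os.clock()
    for _ = 1, iters do
        f()
    end
    return ( os.clock() - start ) * 1000
end

local function a()
    local t = { true, true, true, true }  -- array part preallocated
    t[1], t[2], t[3], t[4] = 1, 2, 3, 4
end

local function b()
    local t = {}  -- array part grows as entries are added
    t[1], t[2], t[3], t[4] = 1, 2, 3, 4
end

print( "a: " .. bench( a, 100000 ) .. " ms" )
print( "b: " .. bench( b, 100000 ) .. " ms" )
```

Exact numbers will vary by machine and Lua build, so treat any single run as indicative only.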

Thanks Siu. I guess there is no point in trying to rationalize undefined behaviour, then.
It’s true that table.remove() shifts the entries after the removed one, but that method may not always serve your purpose. I generally avoid leaving holes in tables; there are other workarounds.
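One such workaround (a common Lua idiom, sketched here rather than taken from the thread) is to store false as a tombstone instead of nil: false is a real value, so the table remains a sequence and # stays well defined.

```lua
local c = { 1, 2, 3 }

c[2] = false  -- mark the slot as "removed" without creating a hole

print( #c )  --> 3: all keys 1..3 still hold values, so c is a sequence

-- Iterate, skipping the tombstones:
for i = 1, #c do
    if c[i] then
        print( i, c[i] )
    end
end
```

The trade-off is that every consumer of the table has to know to skip false entries.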

Aah…that explains everything! :+1: Good catch!

That’s a very interesting observation, thanks for sharing! The 30ms difference for a single table is significant; imagine having hundreds of tables.

Yeah, but given that it was 30ms over 100 000 iterations, we are talking about an absolute difference of 0.0003ms per function call. It won’t make a difference for most use cases.

Right, it’s still good to know though. For a complex app every optimization is welcome.