"At most logO(1)n" means that there is a constant c such that what is being measured is O(logcn).
In a more general context, f(n)∈logO(1)n is equivalent to the statement that there exists (possibly negative) constants a and b such that f(n)∈O(logan) and f(n)∈Ω(logbn).
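For a concrete example of a function meeting both bounds (the particular function is chosen here just for illustration), take $a = b = \tfrac{1}{2}$:
$$\sqrt{\log n} = \log^{1/2} n \in \log^{O(1)} n, \qquad \text{since } \sqrt{\log n} \in O(\log^{1/2} n) \text{ and } \sqrt{\log n} \in \Omega(\log^{1/2} n).$$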
It is easy to overlook the $\Omega(\log^b n)$ lower bound. In a setting where that would matter (which would be very uncommon if you're exclusively interested in studying asymptotic growth), you shouldn't have complete confidence that the author actually meant the lower bound, and you would have to rely on the context to make sure.
The literal meaning of the notation $\log^{O(1)} n$ is doing arithmetic on the family of functions $O(1)$, resulting in the family of all functions $\log^{g(n)} n$, where $g(n) \in O(1)$. This works in pretty much the same way as how multiplying $O(g(n))$ by $h(n)$ results in $O(g(n)h(n))$, except that you get a result that isn't expressed so simply.
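Spelled out as sets (one standard way of reading such expressions), this is
$$\log^{O(1)} n = \left\{ \log^{g(n)} n : g(n) \in O(1) \right\},$$
just as $h(n) \cdot O(g(n)) = \{ h(n)\,f(n) : f(n) \in O(g(n)) \}$, which can be written more simply as $O(h(n)\,g(n))$. As an illustration of a member that isn't expressed so simply, $g(n) = 2 + \sin n \in O(1)$ yields $\log^{2+\sin n} n$, which oscillates between roughly $\log n$ and $\log^3 n$.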
Since the details of the lower bound are probably in unfamiliar territory, it's worth looking at some counterexamples. Recall that any $g(n) \in O(1)$ is bounded in magnitude; that is, there is a constant $c$ such that for all sufficiently large $n$, $|g(n)| < c$:
$$g(n) < c \qquad \text{and} \qquad g(n) > -c.$$
This means, contrary to more typical uses of big-oh notation, functions that decrease too rapidly can fail to be in $\log^{O(1)} n$; for example,
$$\frac{1}{n} = \log^{-(\log n)/(\log \log n)} n \notin \log^{O(1)} n$$
because
$$-\frac{\log n}{\log \log n} \notin O(1).$$
The exponent here grows in magnitude too rapidly to be bounded by $O(1)$.
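One way to verify the first equality (assuming all logarithms are taken to the same fixed base) is to take logarithms:
$$\log\left( \log^{-(\log n)/(\log \log n)} n \right) = -\frac{\log n}{\log \log n} \cdot \log \log n = -\log n = \log \frac{1}{n}.$$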
A counterexample of a somewhat different sort is that $-1 \notin \log^{O(1)} n$.
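Here the issue is not the rate of decrease at all: under the literal reading above, every member of the family has the form $\log^{g(n)} n$, which is strictly positive for $n > 1$, so no member can equal a negative constant. Equivalently,
$$-1 \notin \Omega(\log^b n) \quad \text{for every constant } b,$$
since $\log^b n$ is eventually positive while $-1$ is not.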