Export of internal Abseil changes
-- 074a799119ac881b8b8ce59ef7a3166d1aa025ac by Tom Manshreck <shreck@google.com>:

  nit: Add return info for StrCat

  PiperOrigin-RevId: 278647298

-- d58a2a39ab6f50266cc695506ba2e86bdb45d795 by Mark Barolak <mbar@google.com>:

  Stop suppressing no-nested-anon-types warnings because there aren't actually
  any warnings to suppress.

  PiperOrigin-RevId: 278440548

-- 445051bd280b9a6f608a8c80b3d7cafcc1377a03 by Abseil Team <absl-team@google.com>:

  ResetThreadIdentity does not need to clear identity->waiter_state.

  ResetThreadIdentity is only called by NewThreadIdentity. NewThreadIdentity is
  only called by CreateThreadIdentity. CreateThreadIdentity calls
  PerThreadSem::Init, which initializes identity->waiter_state, immediately
  after calling NewThreadIdentity. Therefore ResetThreadIdentity does not need
  to clear identity->waiter_state.

  PiperOrigin-RevId: 278429844

-- c2079b664d92be40d5e365abcca4e9b3505a75a6 by Abseil Team <absl-team@google.com>:

  Delete the f->header.magic check in LowLevelAlloc::Free().

  The f->header.magic check in LowLevelAlloc::Free() is redundant, because
  AddToFreeList() will immediately perform the same check.

  Also fix a typo in the comment that documents the lock requirements for
  Next(). The comment should say "L >= arena->mu", which is equivalent to
  EXCLUSIVE_LOCKS_REQUIRED(arena->mu).

  NOTE: LowLevelAlloc::Free() performs the f->header.magic check without
  holding the arena lock. This may have caused the TSAN data race warning
  reported in bug 143697235.

  PiperOrigin-RevId: 278414140

-- 5534f35ce677165700117d868f51607ed1f0d73b by Greg Falcon <gfalcon@google.com>:

  Add an internal (unsupported) PiecewiseCombiner class to allow hashing
  buffers piecewise.

  PiperOrigin-RevId: 278388902

GitOrigin-RevId: 074a799119ac881b8b8ce59ef7a3166d1aa025ac
Change-Id: I61734850cbbb01c7585e8c736a5bb56e416512a8
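The ResetThreadIdentity change is justified purely by call order. The following is a minimal, hypothetical sketch of that chain; the struct, the namespace-style PerThreadSem, every field name, and all function bodies are stand-ins for illustration, not Abseil's actual synchronization internals.

#include <cstdint>

// Hypothetical stand-ins for the real ThreadIdentity/PerThreadSem machinery.
struct ThreadIdentity {
  std::intptr_t blocked_count;  // stand-in for fields that Reset does clear
  int waiter_state;             // stand-in for PerThreadSem's waiter state
};

namespace PerThreadSem {
void Init(ThreadIdentity* identity) {
  identity->waiter_state = 1;  // the real Init() sets up the waiter state here
}
}  // namespace PerThreadSem

void ResetThreadIdentity(ThreadIdentity* identity) {
  identity->blocked_count = 0;
  // waiter_state is deliberately NOT cleared: the only call chain is
  // CreateThreadIdentity -> NewThreadIdentity -> ResetThreadIdentity, and
  // CreateThreadIdentity calls PerThreadSem::Init() right afterwards, which
  // initializes waiter_state anyway.
}

ThreadIdentity* NewThreadIdentity() {
  ThreadIdentity* identity = new ThreadIdentity;
  ResetThreadIdentity(identity);
  return identity;
}

ThreadIdentity* CreateThreadIdentity() {
  ThreadIdentity* identity = NewThreadIdentity();
  PerThreadSem::Init(identity);  // waiter_state becomes valid here
  return identity;
}

int main() {
  ThreadIdentity* id = CreateThreadIdentity();
  bool ok = (id->waiter_state == 1);  // initialized by PerThreadSem::Init
  delete id;
  return ok ? 0 : 1;
}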
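PiecewiseCombiner is an internal, unsupported detail of absl::Hash, so the sketch below deliberately avoids its real API. It only illustrates the idea named in the last change: a streaming hash state (a plain FNV-1a here, chosen for brevity) can absorb a buffer in arbitrary chunks and produce the same digest as absorbing it in a single call.

#include <cstddef>
#include <cstdint>
#include <iostream>
#include <string>

// Hypothetical illustration of "piecewise" hashing, not Abseil's internal
// PiecewiseCombiner: a streaming FNV-1a state whose result is independent of
// how the input bytes are split into buffers.
class StreamingFnv1a {
 public:
  void AddBuffer(const void* data, size_t size) {
    const auto* p = static_cast<const unsigned char*>(data);
    for (size_t i = 0; i < size; ++i) {
      state_ ^= p[i];
      state_ *= 1099511628211ull;  // FNV prime
    }
  }
  uint64_t Finalize() const { return state_; }

 private:
  uint64_t state_ = 14695981039346656037ull;  // FNV offset basis
};

int main() {
  const std::string payload = "hash me in pieces";

  StreamingFnv1a whole;
  whole.AddBuffer(payload.data(), payload.size());

  StreamingFnv1a pieces;
  pieces.AddBuffer(payload.data(), 5);
  pieces.AddBuffer(payload.data() + 5, payload.size() - 5);

  // Feeding the bytes in one call or in two chunks yields the same digest.
  std::cout << (whole.Finalize() == pieces.Finalize()) << "\n";  // prints 1
  return 0;
}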
parent 20de2db748
commit e96ae2203b
11 changed files with 278 additions and 15 deletions
@@ -447,7 +447,7 @@ static inline uintptr_t RoundUp(uintptr_t addr, uintptr_t align) {
 // that the freelist is in the correct order, that it
 // consists of regions marked "unallocated", and that no two regions
 // are adjacent in memory (they should have been coalesced).
-// L < arena->mu
+// L >= arena->mu
 static AllocList *Next(int i, AllocList *prev, LowLevelAlloc::Arena *arena) {
   ABSL_RAW_CHECK(i < prev->levels, "too few levels in Next()");
   AllocList *next = prev->next[i];
@@ -508,8 +508,6 @@ void LowLevelAlloc::Free(void *v) {
   if (v != nullptr) {
     AllocList *f = reinterpret_cast<AllocList *>(
         reinterpret_cast<char *>(v) - sizeof (f->header));
-    ABSL_RAW_CHECK(f->header.magic == Magic(kMagicAllocated, &f->header),
-                   "bad magic number in Free()");
     LowLevelAlloc::Arena *arena = f->header.arena;
     ArenaLock section(arena);
     AddToFreelist(v, arena);
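The first hunk above fixes the lock-requirement comment on Next(): "L >= arena->mu" means the caller must already hold the arena lock, which is what EXCLUSIVE_LOCKS_REQUIRED(arena->mu) expresses. The second hunk drops the magic-number check that Free() performed before taking that lock, leaving the equivalent check in AddToFreelist(), which runs under the lock. As a minimal sketch, assuming Abseil's thread_annotations.h and Mutex headers are available and using made-up class and method names, the same "caller must hold the lock" contract can be stated as a compiler-checked annotation instead of a comment:

#include "absl/base/thread_annotations.h"
#include "absl/synchronization/mutex.h"

// Hypothetical illustration; LowLevelAlloc itself keeps this contract in a
// comment, but the compiler-checked equivalent looks like this.
class FreeList {
 public:
  // "L >= mu_": the caller must already hold mu_ when calling NextRegion().
  void* NextRegion() ABSL_EXCLUSIVE_LOCKS_REQUIRED(mu_) { return head_; }

  void* LockedNextRegion() {
    absl::MutexLock lock(&mu_);  // acquiring mu_ here satisfies the annotation
    return NextRegion();
  }

 private:
  absl::Mutex mu_;
  void* head_ = nullptr;
};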