# Auto merge of #49669 - SimonSapin:global-alloc, r=alexcrichton
Add GlobalAlloc trait + tweaks for initial stabilization

This is the outcome of discussion at the Rust All Hands in Berlin. The high-level goal is to stabilize, sooner rather than later, the ability to [change the global allocator](#27389), as well as a way to allocate memory without abusing `Vec::with_capacity` + `mem::forget`.
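
(For reference, the `Vec` workaround being replaced looks roughly like this; a sketch, with names of our choosing:)

```rust
use std::mem;

// The old hack: abuse `Vec` to get raw memory from the global
// allocator, then leak the `Vec` so the buffer is not freed here.
fn alloc_bytes_via_vec(len: usize) -> *mut u8 {
    let mut v = Vec::<u8>::with_capacity(len);
    let ptr = v.as_mut_ptr();
    mem::forget(v);
    ptr
}

fn main() {
    let p = alloc_bytes_via_vec(16);
    assert!(!p.is_null());
    // Freeing later requires reconstituting the `Vec` with the same capacity:
    unsafe { drop(Vec::from_raw_parts(p, 0, 16)) }
}
```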

Since we’re not ready to settle every detail of the `Alloc` trait for the purpose of collections that are generic over the allocator type (for example, whether there should be a separate trait for deallocation only, and what exactly that would look like), we propose separately introducing **a new `GlobalAlloc` trait**, for use with the `#[global_allocator]` attribute.

We also propose a number of changes to existing APIs. They are batched in this one PR in order to minimize disruption to Nightly users.

The plan for initial stabilization is detailed in the tracking issue #49668.

CC @rust-lang/libs, @glandium

## Immediate breaking changes to unstable features

* For pointers to allocated memory, change the pointed type from `u8` to `Opaque`, a new public [extern type](#43467). Since extern types are not `Sized`, `<*mut _>::offset` cannot be used without first casting to another pointer type. (We hope that extern types can also be stabilized soon.)
* In the `Alloc` trait, change these pointers to `ptr::NonNull` and change the `AllocErr` type to a zero-size struct. This makes the `Result<ptr::NonNull<Opaque>, AllocErr>` return types pointer-sized.
* Instead of a new `Layout`, `realloc` takes only a new size (in addition to the pointer and old `Layout`). Changing the alignment is not supported with `realloc`.
* Change the return type of `Layout::from_size_align` from `Option<Self>` to `Result<Self, LayoutErr>`, with `LayoutErr` a new opaque struct. (A sketch of these new signatures in use follows this list.)
* A `static` item registered as the global allocator with the `#[global_allocator]` attribute **must now implement the new `GlobalAlloc` trait** instead of `Alloc`. (A registration sketch follows the trait definition below.)
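
Taken together, the new signatures look like this in use (a sketch against the post-PR nightly API; `demo` is our name, and the error handling is deliberately crude):

```rust
#![feature(allocator_api)]

use std::alloc::{Alloc, Global, Layout, Opaque};
use std::ptr::{self, NonNull};

unsafe fn demo() {
    // `Layout::from_size_align` now returns `Result<Layout, LayoutErr>`.
    let layout = Layout::from_size_align(64, 8).expect("bad layout");

    // `Alloc::alloc` now returns the pointer-sized
    // `Result<NonNull<Opaque>, AllocErr>`.
    let ptr: NonNull<Opaque> = Global.alloc(layout).unwrap_or_else(|_| Global.oom());

    // `Opaque` is an extern type, so cast before doing pointer arithmetic.
    ptr::write_bytes(ptr.as_ptr() as *mut u8, 0, layout.size());

    // `realloc` takes only a new size; the alignment cannot change.
    let new_size = 128;
    let ptr = Global.realloc(ptr, layout, new_size).unwrap_or_else(|_| Global.oom());

    let new_layout = Layout::from_size_align(new_size, layout.align()).unwrap();
    Global.dealloc(ptr, new_layout);
}

fn main() {
    unsafe { demo() }
}
```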

## Eventually-breaking changes to unstable features, with a deprecation period

* Rename the respective `heap` modules to `alloc` in the `core`, `alloc`, and `std` crates. (Yes, this does mean that `::alloc::alloc::Alloc::alloc` is a valid path to a trait method if you have `extern crate alloc;`.)
* Rename the `Heap` type to `Global`, since it is the entry point for what’s registered with `#[global_allocator]`.

Old names remain available for now, as deprecated `pub use` reexports.
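
During the deprecation period both spellings compile (a sketch; expect deprecation warnings on the old names):

```rust
#![feature(allocator_api)]
#![allow(deprecated, unused_imports)]

use std::heap::Heap;    // old path, kept as a deprecated reexport
use std::alloc::Global; // new canonical path; `Heap` is now an alias for `Global`

fn main() {}
```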

## Backward-compatible changes

* Add a new [extern type](#43467) `Opaque`, for use in pointers to allocated memory.
* Add a new `GlobalAlloc` trait, shown below. Unlike `Alloc`, it uses bare `*mut Opaque` without `NonNull` or `Result`: a null return value indicates an error (of unspecified nature). This is easier to implement on top of `malloc`-like APIs.
* Add impls of `GlobalAlloc` for both the `Global` and `System` types, in addition to the existing impls of `Alloc`. This enables calling `GlobalAlloc` methods on the stable channel before `Alloc` is stable. Implementing two traits with identical method names can make some calls ambiguous, but most code is expected to have at most one of the two traits in scope. Erroneous code like `use std::alloc::Global; #[global_allocator] static A: Global = Global;` (which defines the global allocator to call itself, causing infinite recursion) is not statically prevented by the type system, but we count on it being hard enough to write accidentally and easy enough to diagnose.

```rust
extern {
    pub type Opaque;
}

pub unsafe trait GlobalAlloc {
    unsafe fn alloc(&self, layout: Layout) -> *mut Opaque;
    unsafe fn dealloc(&self, ptr: *mut Opaque, layout: Layout);

    unsafe fn alloc_zeroed(&self, layout: Layout) -> *mut Opaque {
        // Default impl: self.alloc(), then ptr::write_bytes() to zero
        // the block on success.
        let size = layout.size();
        let ptr = self.alloc(layout);
        if !ptr.is_null() {
            ptr::write_bytes(ptr as *mut u8, 0, size);
        }
        ptr
    }

    unsafe fn realloc(&self, ptr: *mut Opaque, old_layout: Layout, new_size: usize) -> *mut Opaque {
        // Default impl: self.alloc() at the new size, ptr::copy_nonoverlapping()
        // of the old contents (truncated if shrinking), then self.dealloc().
        let new_layout = Layout::from_size_align_unchecked(new_size, old_layout.align());
        let new_ptr = self.alloc(new_layout);
        if !new_ptr.is_null() {
            ptr::copy_nonoverlapping(ptr as *const u8,
                                     new_ptr as *mut u8,
                                     cmp::min(old_layout.size(), new_size));
            self.dealloc(ptr, old_layout);
        }
        new_ptr
    }

    fn oom(&self) -> ! {
        // Default impl: intrinsics::abort
        unsafe { intrinsics::abort() }
    }

    // More methods with default impls may be added in the future
}
```
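
For illustration, registering a custom global allocator under the new regime would look roughly like this (a sketch for a nightly of this era; `MyAlloc` and `GLOBAL` are hypothetical names, and the allocator simply forwards to `System`):

```rust
#![feature(global_allocator, allocator_api)]

use std::alloc::{GlobalAlloc, Layout, Opaque, System};

struct MyAlloc;

unsafe impl GlobalAlloc for MyAlloc {
    unsafe fn alloc(&self, layout: Layout) -> *mut Opaque {
        // Forward to the system allocator; null signals failure.
        System.alloc(layout)
    }

    unsafe fn dealloc(&self, ptr: *mut Opaque, layout: Layout) {
        System.dealloc(ptr, layout)
    }
}

#[global_allocator]
static GLOBAL: MyAlloc = MyAlloc;

fn main() {
    // All default collections now allocate through `GLOBAL`.
    let v = vec![1, 2, 3];
    assert_eq!(v.len(), 3);
}
```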

## Bikeshed

The tracking issue #49668 lists some open questions. If consensus is reached before this PR is merged, changes can be integrated.
bors committed Apr 13, 2018
2 parents f9f9050 + c5ffdd7 commit 99d4886
Showing 56 changed files with 1,038 additions and 1,496 deletions.
3 changes: 0 additions & 3 deletions src/Cargo.lock


2 changes: 1 addition & 1 deletion src/dlmalloc
Submodule dlmalloc updated 4 files
+1 −0 Cargo.toml
+4 −4 src/global.rs
+70 −96 src/lib.rs
+1 −1 tests/smoke.rs
2 changes: 1 addition & 1 deletion src/doc/nomicon
````diff
@@ -29,16 +29,17 @@ looks like:
 ```rust
 #![feature(global_allocator, allocator_api, heap_api)]
 
-use std::heap::{Alloc, System, Layout, AllocErr};
+use std::alloc::{GlobalAlloc, System, Layout, Opaque};
+use std::ptr::NonNull;
 
 struct MyAllocator;
 
-unsafe impl<'a> Alloc for &'a MyAllocator {
-    unsafe fn alloc(&mut self, layout: Layout) -> Result<*mut u8, AllocErr> {
+unsafe impl GlobalAlloc for MyAllocator {
+    unsafe fn alloc(&self, layout: Layout) -> *mut Opaque {
         System.alloc(layout)
     }
 
-    unsafe fn dealloc(&mut self, ptr: *mut u8, layout: Layout) {
+    unsafe fn dealloc(&self, ptr: *mut Opaque, layout: Layout) {
         System.dealloc(ptr, layout)
     }
 }
````
215 changes: 215 additions & 0 deletions src/liballoc/alloc.rs
```rust
// Copyright 2014-2015 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

#![unstable(feature = "allocator_api",
            reason = "the precise API and guarantees it provides may be tweaked \
                      slightly, especially to possibly take into account the \
                      types being stored to make room for a future \
                      tracing garbage collector",
            issue = "32838")]

use core::intrinsics::{min_align_of_val, size_of_val};
use core::ptr::NonNull;
use core::usize;

#[doc(inline)]
pub use core::alloc::*;

#[cfg(stage0)]
extern "Rust" {
    #[allocator]
    #[rustc_allocator_nounwind]
    fn __rust_alloc(size: usize, align: usize, err: *mut u8) -> *mut u8;
    #[cold]
    #[rustc_allocator_nounwind]
    fn __rust_oom(err: *const u8) -> !;
    #[rustc_allocator_nounwind]
    fn __rust_dealloc(ptr: *mut u8, size: usize, align: usize);
    #[rustc_allocator_nounwind]
    fn __rust_realloc(ptr: *mut u8,
                      old_size: usize,
                      old_align: usize,
                      new_size: usize,
                      new_align: usize,
                      err: *mut u8) -> *mut u8;
    #[rustc_allocator_nounwind]
    fn __rust_alloc_zeroed(size: usize, align: usize, err: *mut u8) -> *mut u8;
}

#[cfg(not(stage0))]
extern "Rust" {
    #[allocator]
    #[rustc_allocator_nounwind]
    fn __rust_alloc(size: usize, align: usize) -> *mut u8;
    #[cold]
    #[rustc_allocator_nounwind]
    fn __rust_oom() -> !;
    #[rustc_allocator_nounwind]
    fn __rust_dealloc(ptr: *mut u8, size: usize, align: usize);
    #[rustc_allocator_nounwind]
    fn __rust_realloc(ptr: *mut u8,
                      old_size: usize,
                      align: usize,
                      new_size: usize) -> *mut u8;
    #[rustc_allocator_nounwind]
    fn __rust_alloc_zeroed(size: usize, align: usize) -> *mut u8;
}

#[derive(Copy, Clone, Default, Debug)]
pub struct Global;

#[unstable(feature = "allocator_api", issue = "32838")]
#[rustc_deprecated(since = "1.27.0", reason = "type renamed to `Global`")]
pub type Heap = Global;

#[unstable(feature = "allocator_api", issue = "32838")]
#[rustc_deprecated(since = "1.27.0", reason = "type renamed to `Global`")]
#[allow(non_upper_case_globals)]
pub const Heap: Global = Global;

unsafe impl GlobalAlloc for Global {
    #[inline]
    unsafe fn alloc(&self, layout: Layout) -> *mut Opaque {
        #[cfg(not(stage0))]
        let ptr = __rust_alloc(layout.size(), layout.align());
        #[cfg(stage0)]
        let ptr = __rust_alloc(layout.size(), layout.align(), &mut 0);
        ptr as *mut Opaque
    }

    #[inline]
    unsafe fn dealloc(&self, ptr: *mut Opaque, layout: Layout) {
        __rust_dealloc(ptr as *mut u8, layout.size(), layout.align())
    }

    #[inline]
    unsafe fn realloc(&self, ptr: *mut Opaque, layout: Layout, new_size: usize) -> *mut Opaque {
        #[cfg(not(stage0))]
        let ptr = __rust_realloc(ptr as *mut u8, layout.size(), layout.align(), new_size);
        #[cfg(stage0)]
        let ptr = __rust_realloc(ptr as *mut u8, layout.size(), layout.align(),
                                 new_size, layout.align(), &mut 0);
        ptr as *mut Opaque
    }

    #[inline]
    unsafe fn alloc_zeroed(&self, layout: Layout) -> *mut Opaque {
        #[cfg(not(stage0))]
        let ptr = __rust_alloc_zeroed(layout.size(), layout.align());
        #[cfg(stage0)]
        let ptr = __rust_alloc_zeroed(layout.size(), layout.align(), &mut 0);
        ptr as *mut Opaque
    }

    #[inline]
    fn oom(&self) -> ! {
        unsafe {
            #[cfg(not(stage0))]
            __rust_oom();
            #[cfg(stage0)]
            __rust_oom(&mut 0);
        }
    }
}

unsafe impl Alloc for Global {
    #[inline]
    unsafe fn alloc(&mut self, layout: Layout) -> Result<NonNull<Opaque>, AllocErr> {
        NonNull::new(GlobalAlloc::alloc(self, layout)).ok_or(AllocErr)
    }

    #[inline]
    unsafe fn dealloc(&mut self, ptr: NonNull<Opaque>, layout: Layout) {
        GlobalAlloc::dealloc(self, ptr.as_ptr(), layout)
    }

    #[inline]
    unsafe fn realloc(&mut self,
                      ptr: NonNull<Opaque>,
                      layout: Layout,
                      new_size: usize)
                      -> Result<NonNull<Opaque>, AllocErr>
    {
        NonNull::new(GlobalAlloc::realloc(self, ptr.as_ptr(), layout, new_size)).ok_or(AllocErr)
    }

    #[inline]
    unsafe fn alloc_zeroed(&mut self, layout: Layout) -> Result<NonNull<Opaque>, AllocErr> {
        NonNull::new(GlobalAlloc::alloc_zeroed(self, layout)).ok_or(AllocErr)
    }

    #[inline]
    fn oom(&mut self) -> ! {
        GlobalAlloc::oom(self)
    }
}

/// The allocator for unique pointers.
// This function must not unwind. If it does, MIR trans will fail.
#[cfg(not(test))]
#[lang = "exchange_malloc"]
#[inline]
unsafe fn exchange_malloc(size: usize, align: usize) -> *mut u8 {
    if size == 0 {
        align as *mut u8
    } else {
        let layout = Layout::from_size_align_unchecked(size, align);
        let ptr = Global.alloc(layout);
        if !ptr.is_null() {
            ptr as *mut u8
        } else {
            Global.oom()
        }
    }
}

#[cfg_attr(not(test), lang = "box_free")]
#[inline]
pub(crate) unsafe fn box_free<T: ?Sized>(ptr: *mut T) {
    let size = size_of_val(&*ptr);
    let align = min_align_of_val(&*ptr);
    // We do not allocate for Box<T> when T is ZST, so deallocation is also not necessary.
    if size != 0 {
        let layout = Layout::from_size_align_unchecked(size, align);
        Global.dealloc(ptr as *mut Opaque, layout);
    }
}

#[cfg(test)]
mod tests {
    extern crate test;
    use self::test::Bencher;
    use boxed::Box;
    use alloc::{Global, Alloc, Layout};

    #[test]
    fn allocate_zeroed() {
        unsafe {
            let layout = Layout::from_size_align(1024, 1).unwrap();
            let ptr = Global.alloc_zeroed(layout.clone())
                .unwrap_or_else(|_| Global.oom());

            let mut i = ptr.cast::<u8>().as_ptr();
            let end = i.offset(layout.size() as isize);
            while i < end {
                assert_eq!(*i, 0);
                i = i.offset(1);
            }
            Global.dealloc(ptr, layout);
        }
    }

    #[bench]
    fn alloc_owned_small(b: &mut Bencher) {
        b.iter(|| {
            let _: Box<_> = box 10;
        })
    }
}
```
23 changes: 9 additions & 14 deletions src/liballoc/arc.rs
```diff
@@ -21,7 +21,6 @@ use core::sync::atomic::Ordering::{Acquire, Relaxed, Release, SeqCst};
 use core::borrow;
 use core::fmt;
 use core::cmp::Ordering;
-use core::heap::{Alloc, Layout};
 use core::intrinsics::abort;
 use core::mem::{self, align_of_val, size_of_val, uninitialized};
 use core::ops::Deref;
@@ -32,7 +31,7 @@ use core::hash::{Hash, Hasher};
 use core::{isize, usize};
 use core::convert::From;
 
-use heap::{Heap, box_free};
+use alloc::{Global, Alloc, Layout, box_free};
 use boxed::Box;
 use string::String;
 use vec::Vec;
@@ -513,15 +512,13 @@ impl<T: ?Sized> Arc<T> {
     // Non-inlined part of `drop`.
     #[inline(never)]
     unsafe fn drop_slow(&mut self) {
-        let ptr = self.ptr.as_ptr();
-
         // Destroy the data at this time, even though we may not free the box
         // allocation itself (there may still be weak pointers lying around).
         ptr::drop_in_place(&mut self.ptr.as_mut().data);
 
         if self.inner().weak.fetch_sub(1, Release) == 1 {
             atomic::fence(Acquire);
-            Heap.dealloc(ptr as *mut u8, Layout::for_value(&*ptr))
+            Global.dealloc(self.ptr.as_opaque(), Layout::for_value(self.ptr.as_ref()))
         }
     }
@@ -555,11 +552,11 @@
 
         let layout = Layout::for_value(&*fake_ptr);
 
-        let mem = Heap.alloc(layout)
-            .unwrap_or_else(|e| Heap.oom(e));
+        let mem = Global.alloc(layout)
+            .unwrap_or_else(|_| Global.oom());
 
         // Initialize the real ArcInner
-        let inner = set_data_ptr(ptr as *mut T, mem) as *mut ArcInner<T>;
+        let inner = set_data_ptr(ptr as *mut T, mem.as_ptr() as *mut u8) as *mut ArcInner<T>;
 
         ptr::write(&mut (*inner).strong, atomic::AtomicUsize::new(1));
         ptr::write(&mut (*inner).weak, atomic::AtomicUsize::new(1));
@@ -626,7 +623,7 @@ impl<T: Clone> ArcFromSlice<T> for Arc<[T]> {
         // In the event of a panic, elements that have been written
         // into the new ArcInner will be dropped, then the memory freed.
         struct Guard<T> {
-            mem: *mut u8,
+            mem: NonNull<u8>,
             elems: *mut T,
             layout: Layout,
             n_elems: usize,
@@ -640,7 +637,7 @@ impl<T: Clone> ArcFromSlice<T> for Arc<[T]> {
                 let slice = from_raw_parts_mut(self.elems, self.n_elems);
                 ptr::drop_in_place(slice);
 
-                Heap.dealloc(self.mem, self.layout.clone());
+                Global.dealloc(self.mem.as_opaque(), self.layout.clone());
             }
         }
@@ -656,7 +653,7 @@ impl<T: Clone> ArcFromSlice<T> for Arc<[T]> {
         let elems = &mut (*ptr).data as *mut [T] as *mut T;
 
         let mut guard = Guard{
-            mem: mem,
+            mem: NonNull::new_unchecked(mem),
             elems: elems,
             layout: layout,
             n_elems: 0,
@@ -1148,8 +1145,6 @@ impl<T: ?Sized> Drop for Weak<T> {
     /// assert!(other_weak_foo.upgrade().is_none());
     /// ```
     fn drop(&mut self) {
-        let ptr = self.ptr.as_ptr();
-
         // If we find out that we were the last weak pointer, then its time to
         // deallocate the data entirely. See the discussion in Arc::drop() about
         // the memory orderings
@@ -1161,7 +1156,7 @@
         if self.inner().weak.fetch_sub(1, Release) == 1 {
             atomic::fence(Acquire);
             unsafe {
-                Heap.dealloc(ptr as *mut u8, Layout::for_value(&*ptr))
+                Global.dealloc(self.ptr.as_opaque(), Layout::for_value(self.ptr.as_ref()))
             }
         }
     }
```