Rollup of 7 pull requests #68319

Closed
wants to merge 22 commits into from
Changes from all commits (22 commits):
6421127
Update rustc-guide
JohnTitor Jan 6, 2020
12f029b
Fix deref impl on type alias
GuillaumeGomez Jan 10, 2020
fd4a88f
Add test for typedef deref
GuillaumeGomez Jan 10, 2020
81a5b94
formatting
GuillaumeGomez Jan 10, 2020
6e79146
remove unneeded code from cache.rs
GuillaumeGomez Jan 15, 2020
e6ad49a
Include type alias implementations
GuillaumeGomez Jan 15, 2020
d755238
Simplify deref impls for type aliases
GuillaumeGomez Jan 15, 2020
8a9b951
Fix rendering on sidebar and update tests
GuillaumeGomez Jan 15, 2020
25a8f94
Don't warn about snake case for field puns that don't introduce a new…
jumbatm Nov 23, 2019
3094c37
Improve code readability
GuillaumeGomez Jan 16, 2020
cdc828e
Stop treating `FalseEdges` and `FalseUnwind` as having semantic value…
oli-obk Jan 17, 2020
a91f77c
Add regression test for integer literals in generic arguments in wher…
varkor Jan 17, 2020
9c6c2f1
Clean up E0198 explanation
GuillaumeGomez Jan 16, 2020
482dc77
formatting
GuillaumeGomez Jan 17, 2020
e8a4b93
Clean up E0199 explanation
GuillaumeGomez Jan 17, 2020
2ec99ce
Rollup merge of #66660 - jumbatm:dont_warn_about_snake_case_in_patter…
tmandry Jan 17, 2020
ce3bfeb
Rollup merge of #67940 - JohnTitor:rustc-guide, r=JohnTitor
tmandry Jan 17, 2020
7f44666
Rollup merge of #68093 - GuillaumeGomez:fix-deref-impl-typedef, r=oli…
tmandry Jan 17, 2020
fe413c7
Rollup merge of #68279 - GuillaumeGomez:clean-up-e0198, r=Dylan-DPC
tmandry Jan 17, 2020
66ce826
Rollup merge of #68312 - varkor:issue-67753-regression, r=Centril
tmandry Jan 17, 2020
0f849b9
Rollup merge of #68314 - oli-obk:true_unwind, r=eddyb
tmandry Jan 17, 2020
b7af138
Rollup merge of #68317 - GuillaumeGomez:clean-up-e0199, r=Dylan-DPC
tmandry Jan 17, 2020
2 changes: 1 addition & 1 deletion src/doc/rustc-guide
17 changes: 9 additions & 8 deletions src/librustc_error_codes/error_codes/E0198.md
@@ -1,17 +1,18 @@
A negative implementation is one that excludes a type from implementing a
particular trait. Not being able to use a trait is always a safe operation,
so negative implementations are always safe and never need to be marked as
unsafe.
A negative implementation was marked as unsafe.

```compile_fail
#![feature(optin_builtin_traits)]
Erroneous code example:

```compile_fail
struct Foo;

// unsafe is unnecessary
unsafe impl !Clone for Foo { }
unsafe impl !Clone for Foo { } // error!
```

A negative implementation is one that excludes a type from implementing a
particular trait. Not being able to use a trait is always a safe operation,
so negative implementations are always safe and never need to be marked as
unsafe.

This will compile:

```ignore (ignore auto_trait future compatibility warning)
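For context, a minimal sketch of the compiling form that the E0198 text points towards, assuming the nightly `optin_builtin_traits` feature this file already uses and an illustrative auto trait:

```rust
#![feature(optin_builtin_traits)]

struct Foo;

// Negative impls are only allowed for auto traits, and opting a type out of a
// trait is always safe, so no `unsafe` keyword is needed.
auto trait Marker {}

impl !Marker for Foo {}
```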
21 changes: 15 additions & 6 deletions src/librustc_error_codes/error_codes/E0199.md
@@ -1,14 +1,23 @@
A trait implementation was marked as unsafe while the trait is safe.

Erroneous code example:

```compile_fail,E0199
struct Foo;

trait Bar { }

unsafe impl Bar for Foo { } // error!
```

Safe traits should not have unsafe implementations, therefore marking an
implementation for a safe trait unsafe will cause a compiler error. Removing
the unsafe marker on the trait noted in the error will resolve this problem.
the unsafe marker on the trait noted in the error will resolve this problem:

```compile_fail,E0199
```
struct Foo;

trait Bar { }

// this won't compile because Bar is safe
unsafe impl Bar for Foo { }
// this will compile
impl Bar for Foo { }
impl Bar for Foo { } // ok!
```
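If the `unsafe` on the implementation is actually intended, the usual remedy (not shown in this diff) is to declare the trait itself `unsafe`; a minimal sketch:

```rust
pub struct Foo;

// An `unsafe trait` states a contract that every implementation promises to
// uphold, so its impls must be written with `unsafe impl`.
pub unsafe trait Bar {}

unsafe impl Bar for Foo {} // compiles: the trait is unsafe, so the impl may be too
```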
15 changes: 14 additions & 1 deletion src/librustc_lint/nonstandard_style.rs
@@ -350,7 +350,20 @@ impl<'a, 'tcx> LateLintPass<'a, 'tcx> for NonSnakeCase {
}

fn check_pat(&mut self, cx: &LateContext<'_, '_>, p: &hir::Pat<'_>) {
if let &PatKind::Binding(_, _, ident, _) = &p.kind {
if let &PatKind::Binding(_, hid, ident, _) = &p.kind {
if let hir::Node::Pat(parent_pat) = cx.tcx.hir().get(cx.tcx.hir().get_parent_node(hid))
{
if let PatKind::Struct(_, field_pats, _) = &parent_pat.kind {
for field in field_pats.iter() {
if field.ident != ident {
// Only check if a new name has been introduced, to avoid warning
// on both the struct definition and this pattern.
self.check_snake_case(cx, "variable", &ident);
}
}
return;
}
}
self.check_snake_case(cx, "variable", &ident);
}
}
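A hedged sketch (struct and field names are hypothetical) of the behaviour the `check_pat` change above targets: a field pun repeats an existing field name rather than introducing a new binding, so only renaming bindings are re-checked at the pattern site.

```rust
#[allow(non_snake_case)] // the definition site is where the field name is linted
pub struct Event {
    pub KeyCode: u32,
}

pub fn handle(e: Event) {
    // Pun: no new name is introduced, so the pattern no longer triggers an
    // extra `non_snake_case` warning on top of the one at the definition.
    let Event { KeyCode } = e;

    // A renaming binding is still checked as usual; `keyCode` here would warn.
    let Event { KeyCode: code } = e;

    let _ = (KeyCode, code);
}
```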
19 changes: 6 additions & 13 deletions src/librustc_mir/transform/qualify_min_const_fn.rs
@@ -309,21 +309,22 @@ fn check_terminator(
) -> McfResult {
let span = terminator.source_info.span;
match &terminator.kind {
TerminatorKind::Goto { .. } | TerminatorKind::Return | TerminatorKind::Resume => Ok(()),
TerminatorKind::FalseEdges { .. }
| TerminatorKind::FalseUnwind { .. }
| TerminatorKind::Goto { .. }
| TerminatorKind::Return
| TerminatorKind::Resume => Ok(()),

TerminatorKind::Drop { location, .. } => check_place(tcx, location, span, def_id, body),
TerminatorKind::DropAndReplace { location, value, .. } => {
check_place(tcx, location, span, def_id, body)?;
check_operand(tcx, value, span, def_id, body)
}

TerminatorKind::FalseEdges { .. } | TerminatorKind::SwitchInt { .. }
if !feature_allowed(tcx, def_id, sym::const_if_match) =>
{
TerminatorKind::SwitchInt { .. } if !feature_allowed(tcx, def_id, sym::const_if_match) => {
Err((span, "loops and conditional expressions are not stable in const fn".into()))
}

TerminatorKind::FalseEdges { .. } => Ok(()),
TerminatorKind::SwitchInt { discr, switch_ty: _, values: _, targets: _ } => {
check_operand(tcx, discr, span, def_id, body)
}
@@ -367,13 +368,5 @@ fn check_terminator(
TerminatorKind::Assert { cond, expected: _, msg: _, target: _, cleanup: _ } => {
check_operand(tcx, cond, span, def_id, body)
}

TerminatorKind::FalseUnwind { .. } if feature_allowed(tcx, def_id, sym::const_loop) => {
Ok(())
}

TerminatorKind::FalseUnwind { .. } => {
Err((span, "loops are not allowed in const fn".into()))
}
}
}
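For orientation: `FalseEdges` and `FalseUnwind` are bookkeeping terminators inserted during MIR lowering, so after this change the min-const-fn check keys off the real control-flow terminators such as `SwitchInt`. On a nightly of this PR's era, a `const fn` like the following (behind the then-unstable `const_if_match` feature referenced above) is what that `SwitchInt` arm inspects:

```rust
#![feature(const_if_match)] // nightly-only at the time; gated via sym::const_if_match

// The `match` lowers to a `SwitchInt` terminator; the false edges emitted
// alongside it no longer carry semantic weight in qualify_min_const_fn.
const fn is_zero(x: i32) -> bool {
    match x {
        0 => true,
        _ => false,
    }
}
```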
16 changes: 16 additions & 0 deletions src/librustdoc/clean/inline.rs
@@ -273,6 +273,22 @@ fn build_type_alias(cx: &DocContext<'_>, did: DefId) -> clean::Typedef {
clean::Typedef {
type_: cx.tcx.type_of(did).clean(cx),
generics: (cx.tcx.generics_of(did), predicates).clean(cx),
item_type: build_type_alias_type(cx, did),
}
}

fn build_type_alias_type(cx: &DocContext<'_>, did: DefId) -> Option<clean::Type> {
let type_ = cx.tcx.type_of(did).clean(cx);
type_.def_id().and_then(|did| build_ty(cx, did))
}

pub fn build_ty(cx: &DocContext, did: DefId) -> Option<clean::Type> {
match cx.tcx.def_kind(did)? {
DefKind::Struct | DefKind::Union | DefKind::Enum | DefKind::Const | DefKind::Static => {
Some(cx.tcx.type_of(did).clean(cx))
}
DefKind::TyAlias => build_type_alias_type(cx, did),
_ => None,
}
}

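The situation `build_type_alias_type`/`build_ty` resolve, sketched with made-up items: a type alias (possibly chained) has to be followed to the underlying type so rustdoc can record it as the typedef's `item_type`.

```rust
// Hypothetical items in a crate being documented.
pub struct Inner {
    pub value: u32,
}

pub type Handle = Inner; // DefKind::TyAlias: build_ty recurses through the alias...
pub type Alias = Handle; // ...so even a chained alias resolves down to `Inner`.
```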
35 changes: 24 additions & 11 deletions src/librustdoc/clean/mod.rs
@@ -1122,7 +1122,9 @@ impl Clean<Item> for hir::ImplItem<'_> {
MethodItem((sig, &self.generics, body, Some(self.defaultness)).clean(cx))
}
hir::ImplItemKind::TyAlias(ref ty) => {
TypedefItem(Typedef { type_: ty.clean(cx), generics: Generics::default() }, true)
let type_ = ty.clean(cx);
let item_type = type_.def_id().and_then(|did| inline::build_ty(cx, did));
TypedefItem(Typedef { type_, generics: Generics::default(), item_type }, true)
}
hir::ImplItemKind::OpaqueTy(ref bounds) => OpaqueTyItem(
OpaqueTy { bounds: bounds.clean(cx), generics: Generics::default() },
@@ -1282,10 +1284,13 @@ impl Clean<Item> for ty::AssocItem {

AssocTypeItem(bounds, ty.clean(cx))
} else {
let type_ = cx.tcx.type_of(self.def_id).clean(cx);
let item_type = type_.def_id().and_then(|did| inline::build_ty(cx, did));
TypedefItem(
Typedef {
type_: cx.tcx.type_of(self.def_id).clean(cx),
type_,
generics: Generics { params: Vec::new(), where_predicates: Vec::new() },
item_type,
},
true,
)
@@ -1989,6 +1994,8 @@ impl Clean<String> for ast::Name {

impl Clean<Item> for doctree::Typedef<'_> {
fn clean(&self, cx: &DocContext<'_>) -> Item {
let type_ = self.ty.clean(cx);
let item_type = type_.def_id().and_then(|did| inline::build_ty(cx, did));
Item {
name: Some(self.name.clean(cx)),
attrs: self.attrs.clean(cx),
@@ -1997,10 +2004,7 @@ impl Clean<Item> for doctree::Typedef<'_> {
visibility: self.vis.clean(cx),
stability: cx.stability(self.id).clean(cx),
deprecation: cx.deprecation(self.id).clean(cx),
inner: TypedefItem(
Typedef { type_: self.ty.clean(cx), generics: self.gen.clean(cx) },
false,
),
inner: TypedefItem(Typedef { type_, generics: self.gen.clean(cx), item_type }, false),
}
}
}
@@ -2101,7 +2105,7 @@ impl Clean<Vec<Item>> for doctree::Impl<'_> {
build_deref_target_impls(cx, &items, &mut ret);
}

let provided = trait_
let provided: FxHashSet<String> = trait_
.def_id()
.map(|did| {
cx.tcx
@@ -2112,7 +2116,12 @@ impl Clean<Vec<Item>> for doctree::Impl<'_> {
})
.unwrap_or_default();

ret.push(Item {
let for_ = self.for_.clean(cx);
let type_alias = for_.def_id().and_then(|did| match cx.tcx.def_kind(did) {
Some(DefKind::TyAlias) => Some(cx.tcx.type_of(did).clean(cx)),
_ => None,
});
let make_item = |trait_: Option<Type>, for_: Type, items: Vec<Item>| Item {
name: None,
attrs: self.attrs.clean(cx),
source: self.whence.clean(cx),
@@ -2123,15 +2132,19 @@ impl Clean<Vec<Item>> for doctree::Impl<'_> {
inner: ImplItem(Impl {
unsafety: self.unsafety,
generics: self.generics.clean(cx),
provided_trait_methods: provided,
provided_trait_methods: provided.clone(),
trait_,
for_: self.for_.clean(cx),
for_,
items,
polarity: Some(cx.tcx.impl_polarity(def_id).clean(cx)),
synthetic: false,
blanket_impl: None,
}),
});
};
if let Some(type_alias) = type_alias {
ret.push(make_item(trait_.clone(), type_alias, items.clone()));
}
ret.push(make_item(trait_, for_, items));
ret
}
}
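A hedged sketch (names are illustrative) of what the `make_item` closure above is for: when an impl block is written against a type alias, rustdoc now emits the impl twice, one copy keyed to the alias and one keyed to the aliased type, so both documentation pages list its items.

```rust
pub struct Vec2 {
    pub x: f32,
    pub y: f32,
}

pub type Point = Vec2;

// `for_` cleans to the alias `Point`; since its DefKind is TyAlias, the code
// above also builds a second copy of this impl for the underlying `Vec2`.
impl Point {
    pub fn origin() -> Point {
        Point { x: 0.0, y: 0.0 }
    }
}
```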
8 changes: 8 additions & 0 deletions src/librustdoc/clean/types.rs
@@ -1406,6 +1406,14 @@ pub struct PathSegment {
pub struct Typedef {
pub type_: Type,
pub generics: Generics,
// Type of target item.
pub item_type: Option<Type>,
}

impl GetDefId for Typedef {
fn def_id(&self) -> Option<DefId> {
self.type_.def_id()
}
}

#[derive(Clone, Debug)]
20 changes: 13 additions & 7 deletions src/librustdoc/html/render.rs
@@ -3469,20 +3469,23 @@ fn render_deref_methods(
deref_mut: bool,
) {
let deref_type = impl_.inner_impl().trait_.as_ref().unwrap();
let target = impl_
let (target, real_target) = impl_
.inner_impl()
.items
.iter()
.filter_map(|item| match item.inner {
clean::TypedefItem(ref t, true) => Some(&t.type_),
clean::TypedefItem(ref t, true) => Some(match *t {
clean::Typedef { item_type: Some(ref type_), .. } => (type_, &t.type_),
_ => (&t.type_, &t.type_),
}),
_ => None,
})
.next()
.expect("Expected associated type binding");
let what =
AssocItemRender::DerefFor { trait_: deref_type, type_: target, deref_mut_: deref_mut };
AssocItemRender::DerefFor { trait_: deref_type, type_: real_target, deref_mut_: deref_mut };
if let Some(did) = target.def_id() {
render_assoc_items(w, cx, container_item, did, what)
render_assoc_items(w, cx, container_item, did, what);
} else {
if let Some(prim) = target.primitive_type() {
if let Some(&did) = cx.cache.primitive_locations.get(&prim) {
@@ -4123,12 +4126,15 @@ fn sidebar_assoc_items(it: &clean::Item) -> String {
.filter(|i| i.inner_impl().trait_.is_some())
.find(|i| i.inner_impl().trait_.def_id() == c.deref_trait_did)
{
if let Some(target) = impl_
if let Some((target, real_target)) = impl_
.inner_impl()
.items
.iter()
.filter_map(|item| match item.inner {
clean::TypedefItem(ref t, true) => Some(&t.type_),
clean::TypedefItem(ref t, true) => Some(match *t {
clean::Typedef { item_type: Some(ref type_), .. } => (type_, &t.type_),
_ => (&t.type_, &t.type_),
}),
_ => None,
})
.next()
@@ -4147,7 +4153,7 @@ fn sidebar_assoc_items(it: &clean::Item) -> String {
"{:#}",
impl_.inner_impl().trait_.as_ref().unwrap().print()
)),
Escape(&format!("{:#}", target.print()))
Escape(&format!("{:#}", real_target.print()))
));
out.push_str("</a>");
let mut ret = impls
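An illustrative case (type names are hypothetical) for the `target`/`real_target` split above: when a `Deref::Target` binding is itself a type alias, the alias name is kept for display while the methods are looked up on the resolved type.

```rust
use std::ops::Deref;

pub type Buffer = Vec<u8>;

pub struct Bytes {
    data: Vec<u8>,
}

impl Deref for Bytes {
    // This associated-type binding is the `TypedefItem` the renderer inspects:
    // `real_target` stays `Buffer` for the headings and sidebar, while `target`
    // (the typedef's `item_type`) resolves to `Vec<u8>` so the deref'd methods
    // can actually be found.
    type Target = Buffer;

    fn deref(&self) -> &Buffer {
        &self.data
    }
}
```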
14 changes: 7 additions & 7 deletions src/librustdoc/html/render/cache.rs
@@ -277,7 +277,7 @@ impl DocFolder for Cache {
| clean::StructFieldItem(..)
| clean::VariantItem(..) => (
(
Some(*self.parent_stack.last().unwrap()),
Some(*self.parent_stack.last().expect("parent_stack is empty")),
Some(&self.stack[..self.stack.len() - 1]),
),
false,
@@ -286,7 +286,7 @@
if self.parent_stack.is_empty() {
((None, None), false)
} else {
let last = self.parent_stack.last().unwrap();
let last = self.parent_stack.last().expect("parent_stack is empty 2");
let did = *last;
let path = match self.paths.get(&did) {
// The current stack not necessarily has correlation
@@ -468,7 +468,7 @@ impl DocFolder for Cache {
self.impls.entry(did).or_insert(vec![]).push(impl_item.clone());
}
} else {
let trait_did = impl_item.trait_did().unwrap();
let trait_did = impl_item.trait_did().expect("no trait did");
self.orphan_trait_impls.push((trait_did, dids, impl_item));
}
None
@@ -478,10 +478,10 @@
});

if pushed {
self.stack.pop().unwrap();
self.stack.pop().expect("stack already empty");
}
if parent_pushed {
self.parent_stack.pop().unwrap();
self.parent_stack.pop().expect("parent stack already empty");
}
self.stripped_mod = orig_stripped_mod;
self.parent_is_trait_impl = orig_parent_is_trait_impl;
@@ -594,7 +594,7 @@ fn build_index(krate: &clean::Crate, cache: &mut Cache) -> String {
for item in search_index {
item.parent_idx = item.parent.map(|nodeid| {
if nodeid_to_pathid.contains_key(&nodeid) {
*nodeid_to_pathid.get(&nodeid).unwrap()
*nodeid_to_pathid.get(&nodeid).expect("no pathid")
} else {
let pathid = lastpathid;
nodeid_to_pathid.insert(nodeid, pathid);
@@ -639,7 +639,7 @@
items: crate_items,
paths: crate_paths,
})
.unwrap()
.expect("failed serde conversion")
)
}
