Error: not found: object AssemblyKeys
GitHub issue in sbt/sbt-assembly (forked from softprops/assembly-sbt)
#139 (Open): upgrade to 0.12.0 - Documentation / bug. Opened by samthebest on Jan 2, 2015 · 1 comment
samthebest commented on Jan 2, 2015 (headers added by @eed3si9n):

steps

I've changed my plugins.sbt in project-root/project/ from

```scala
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
```

to

```scala
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.12.0")
```

My build file is at project-root/build.sbt.

problem

When I then run sbt clean I get:

```
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn] 	* com.eed3si9n:sbt-assembly:0.11.2 -> 0.12.0
[warn] Run 'evicted' to see detailed eviction warnings
project-root/build.sbt:1: error: not found: object AssemblyKeys
import AssemblyKeys._
^
sbt.compiler.EvalException: Type error in expression
	at sbt.compiler.Eval.checkError(Eval.scala:384)
	at sbt.compiler.Eval.compileAndLoad(Eval.scala:183)
	at sbt.compiler.Eval.evalCommon(Eval.scala:152)
	at sbt.compiler.Eval.evalDefinitions(Eval.scala:122)
	at sbt.EvaluateConfigurations$.evaluateDefinitions(EvaluateConfigurations.scala:254)
	at sbt.EvaluateConfigurations$.evaluateSbtFile(EvaluateConfigurations.scala:109)
	...
```

notes

FYI, when I was on 0.11.2 I actually got my build file to work by using a different import statement:

```scala
import sbtassembly.Plugin.AssemblyKeys._
```

But now when I try that I get:

```
error: not found: object sbtassembly
import sbtassembly.Plugin.AssemblyKeys._
^
sbt.compiler.EvalException: Type error in expression
	at sbt.compiler.Eval.checkError(Eval.scala:384)
	at sbt.compiler.Eval.compileAndLoad(Eval.scala:183)
	at sbt.compiler.Eval.evalCommon(Eval.scala:152)
	at sbt.compiler.Eval.evalDefinitions(Eval.scala:122)
	...
```

steps2

I've tried reading this: https://
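For comparison, here is a minimal sketch of what a 0.12.0 setup might look like, assuming the plugin's stated migration path (in 0.12.0 it became an auto plugin, so the old `AssemblyKeys` import and `assemblySettings` are removed from build.sbt entirely; the jar name below is hypothetical, and 0.12.x also raised the minimum required sbt version):

```scala
// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.12.0")

// build.sbt -- no `import AssemblyKeys._` needed any more; as an auto
// plugin, sbt-assembly auto-imports its keys into .sbt files.
assemblyJarName in assembly := "project-root.jar" // hypothetical jar name
```

If the old `import AssemblyKeys._` line is still at the top of build.sbt, removing it should make the "not found: object AssemblyKeys" error go away.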
Stack Overflow question: What are AssemblyKeys used for, and how to import them?
http://stackoverflow.com/questions/26194358/what-are-assemblykeys-used-for-and-how-to-import-them
(5 votes; tags: scala, sbt, apache-spark; asked Oct 4 '14 by jayunit100, edited Oct 4 '14 by Jacek Laskowski)

In a recent popular sbt app (https://github.com/databricks/reference-apps), I found a line that required me to

```scala
import AssemblyKeys._
```

This line doesn't compile in sbt or in my IntelliJ IDEA. What is the import used for and why is it necessary?

Comment by jayunit100: The acute issue is quite simple to solve - you just have to get the project directory structure right. But I think a good explanation of AssemblyKeys and their idiomatic usage would still be quite useful as an answer to this question.

Accepted answer (2 votes):

The other answer by @mfirry pretty much covers which part of the build definition brings in `import AssemblyKeys._`. It's the sbt-assembly plugin, which (quoting the plugin's docs) lets you "create a fat JAR of your project with all of its dependencies." The import is needed by the plugin to do its job. You may ask yourself why you would need the plugin at all.
Since you didn't say which part of the application requires the import (and hence the plugin), and I didn't review the examples either, I can only guess: Databricks is the commercial entity behind Apache Spark, so the examples presumably use it. In order to deploy an application onto an Apache Spark cluster you need to assemble the entire application into a single JAR and configure the workers so they can access the binaries (with
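To make the deployment step concrete, here is a rough sketch of the usual workflow; all paths, the class name, and the master URL are hypothetical:

```shell
# Build the fat JAR with sbt-assembly; the output path depends on your
# Scala version and project name.
sbt assembly

# Hand the single assembled JAR to the cluster; every worker then has
# access to all of the application's dependencies in one artifact.
spark-submit \
  --class com.example.Main \
  --master spark://master-host:7077 \
  target/scala-2.10/my-app-assembly-1.0.jar
```

The point of the fat JAR is precisely this step: one self-contained artifact is far easier to ship to the workers than a tree of individual dependency JARs.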
Stack Overflow question: Is there any specific sbt version required to compile the cassandra-spark-connector?
http://stackoverflow.com/questions/30035983/is-there-any-specific-sbt-version-required-to-compile-the-cassandra-spark-connec
(1 vote)

I am assembling the "Cassandra-Spark-Connector". I just followed the steps below:

1. Git clone the connector code
2. Run "sbt assembly"

During the assembly phase I am getting the following error:

```
[info] Done updating.
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn] 	* com.eed3si9n:sbt-assembly:0.11.2 -> 0.13.0
[warn] Run 'evicted' to see detailed eviction warnings
[info] Compiling 5 Scala sources to /home/xxxxxx/Development/iAdLearning/spark-cassandra-connector/project/target/scala-2.10/sbt-0.13/classes...
[error] /home/xxxxxx/Development/iAdLearning/spark-cassandra-connector/project/Settings.scala:23: object Plugin is not a member of package sbtassembly
[error] import sbtassembly.Plugin._
[error]        ^
[error] /home/xxxxxx/Development/iAdLearning/spark-cassandra-connector/project/Settings.scala:24: not found: object AssemblyKeys
[error] import AssemblyKeys._
[error]        ^
[error] /home/xxxxxx/Development/iAdLearning/spark-cassandra-connector/project/Settings.scala:217: not found: value assemblySettings
[error] lazy val sbtAssemblySettings = assemblySettings ++ Seq(
[error]                                ^
[error] three errors found
```
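The eviction warning is the clue here: the connector's project/Settings.scala is written against the sbt-assembly 0.11.x API (`sbtassembly.Plugin._`, `assemblySettings`), but something in the build is pulling in sbt-assembly 0.13.0, whose API dropped those names. One minimal sketch of a way out, assuming the build code itself is left untouched, is to pin the plugin to the version the build was written against:

```scala
// project/plugins.sbt
// Pin sbt-assembly to the 0.11.x line that project/Settings.scala expects;
// 0.11.2 is the version named in the eviction warning above.
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
```

Pinning may not suffice if another plugin transitively requires the newer version (eviction prefers the higher one); the alternative is migrating Settings.scala to the new API, as discussed in the next question.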
Stack Overflow question: object Plugin is not a member of package sbtassembly
http://stackoverflow.com/questions/30470847/object-plugin-is-not-a-member-of-package-sbtassembly
(3 votes)

I'm trying to upgrade my sbt-assembly plugin to 0.13.0. Simultaneously, I'm upgrading sbt from 0.13.5 to 0.13.6. When I try to import the sbt-assembly keys, I get "object Plugin is not a member of package sbtassembly". I have the plugin listed in my project/plugins.sbt file. What am I doing wrong?

Here's my project/plugins.sbt file:

```scala
resolvers += "Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/"
resolvers += "Sonatype OSS Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots"
resolvers += "Typesafe snapshots" at "http://repo.typesafe.com/typesafe/snapshots/"
resolvers += Resolver.url("artifactory", url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
resolvers += Classpaths.sbtPluginReleases

addSbtPlugin("io.spray" % "sbt-revolver" % "0.7.2")
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.3.8")
addSbtPlugin("com.typesafe.sbt" % "sbt-coffeescript" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-less" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-jshint" % "1.0.1")
addSbtPlugin("com.typesafe.sbt" % "sbt-rjs" % "1.0.1")
addSbtPlugin("com.typesafe.sbt" % "sbt-uglify" % "1.0.3")
addSbtPlugin("com.typesafe.sbt" % "sbt-gzip" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-digest" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-mocha" % "1.0.0")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
```

And here's my Build.scala:

```scala
import sbt.Keys._
import sbt._
import play._
import play.Play.autoImport._
import PlayKeys._

object GwBuild extends Build {
  import Dependencies._
  import ProjectDefs._

  lazy val root = gwRootProject(common, api, crowdsourced, ingestion, users, email, website, adminSite).enablePlugins(PlayScala)

  lazy val pingy = gwProject("pingy")(
    Seq(sprayHttpx, akkaActor, sprayCan, sprayRouting) :_*
  )

  lazy val api = gwProject("api")(ws, akkaActor)
    .dependsOn(common % "compile->compile;test->test", users % "compile->compile;tes
```