Closed chuwy closed 8 years ago
Very minor tweaks - please make those and then I will get the rc1 into acceptance testing. I'll also ask @fblundun to do full code review - Fred please go for it!
Also please link in the branch with the blog post when it's ready...
@alexanderdean typos have been fixed.
==> default: TASK: [aws-emr-cli | Ensure EMR CLI is extracted] *****************************
==> default: failed: [127.0.0.1] => {"changed": true, "cmd": " unzip elastic-mapreduce-ruby.zip -d /vagrant/emr-cli/ ", "delta": "0:00:00.366176", "end": "2016-03-28 13:18:58.342546", "rc": 1, "start": "2016-03-28 13:18:57.976370"}
==> default: stderr: replace /vagrant/emr-cli/net/http/connection_pool.rb? [y]es, [n]o, [A]ll, [N]one, [r]ename: NULL
==> default: (EOF or read error, treating as "[N]one" ...)
==> default: stdout: Archive: elastic-mapreduce-ruby.zip
==> default:
==> default: FATAL: all hosts have already failed -- aborting
==> default:
==> default: PLAY RECAP ********************************************************************
==> default: to retry, use: --limit @/home/vagrant/aws-tools.retry
==> default:
==> default: 127.0.0.1 : ok=17 changed=12 unreachable=0 failed=1
The SSH command responded with a non-zero exit status. Vagrant
assumes that this means the command failed. The output for this command
should be in the log above. Please read the output to determine what
went wrong.
Solution: ticket to remove aws-tools from vagrant up.
@alexanderdean can we fix the playbook? Or do you think it shouldn't belong here at all?
Actually, yes, removing it makes sense. We need to run the fatjar from a machine with keys and credentials anyway.
Yep, the one in question uses the long-deprecated EMR CLI: https://github.com/snowplow/ansible-playbooks/blob/master/aws-tools.yml#L6
So even if we need to replace it with something else in the future, I think we should remove it...
When I attempt vagrant push
I get:
[info] Compiling 32 Scala sources to /vagrant/target/scala-2.10/classes...
[error] /vagrant/src/main/scala/com.snowplowanalytics/schemaguru/Migrations.scala:27: imported `Schema' is permanently hidden by definition of object Schema in package schemaguru
[error] import Common.{ TextFile, Schema, splitValidations }
[error] ^
[error] /vagrant/src/main/scala/com.snowplowanalytics/schemaguru/Migrations.scala:54: type mismatch;
[error] found : x$1.type (with underlying type com.snowplowanalytics.schemaguru.Schema)
[error] required: ?{def self: ?}
[error] Note that implicit conversions are not applicable because they are ambiguous:
[error] both method any2stringfmt in object Predef of type (x: Any)scala.runtime.StringFormat
[error] and method any2stringadd in object Predef of type (x: Any)scala.runtime.StringAdd
[error] are possible conversion functions from x$1.type to ?{def self: ?}
[error] implicit val schemaOrdering = implicitly[Order[Int]].contramap[Schema](_.self.version.addition)
[error] ^
[error] /vagrant/src/main/scala/com.snowplowanalytics/schemaguru/Migrations.scala:54: value self is not a member of com.snowplowanalytics.schemaguru.Schema
[error] implicit val schemaOrdering = implicitly[Order[Int]].contramap[Schema](_.self.version.addition)
[error] ^
[error] /vagrant/src/main/scala/com.snowplowanalytics/schemaguru/Migrations.scala:68: value revisionCriterion is not a member of com.snowplowanalytics.schemaguru.Schema
[error] schemas.groupBy(_.revisionCriterion)
[error] ^
[error] /vagrant/src/main/scala/com.snowplowanalytics/schemaguru/Migrations.scala:80: value data is not a member of com.snowplowanalytics.schemaguru.Schema
[error] val flatSource = flattenJsonSchema(sourceSchema.data, splitProduct = false).map(_.elems)
[error] ^
[error] /vagrant/src/main/scala/com.snowplowanalytics/schemaguru/Migrations.scala:81: value data is not a member of com.snowplowanalytics.schemaguru.Schema
[error] val flatSuccessive = successiveSchemas.map(s => flattenJsonSchema(s.data, splitProduct = false)).sequenceU
[error] ^
[error] /vagrant/src/main/scala/com.snowplowanalytics/schemaguru/Migrations.scala:81: Implicit not found: scalaz.Unapply[scalaz.Applicative, A]. Unable to unapply type `A` into a type constructor of kind `M[_]` that is classified by the type class `scalaz.Applicative`. Check that the type class is defined by compiling `implicitly[scalaz.Applicative[type constructor]]` and review the implicits in object Unapply, which only cover common type 'shapes.'
[error] val flatSuccessive = successiveSchemas.map(s => flattenJsonSchema(s.data, splitProduct = false)).sequenceU
[error] ^
[error] /vagrant/src/main/scala/com.snowplowanalytics/schemaguru/Migrations.scala:211: type mismatch;
[error] found : source.type (with underlying type com.snowplowanalytics.schemaguru.Schema)
[error] required: ?{def self: ?}
[error] Note that implicit conversions are not applicable because they are ambiguous:
[error] both method any2stringfmt in object Predef of type (x: Any)scala.runtime.StringFormat
[error] and method any2stringadd in object Predef of type (x: Any)scala.runtime.StringAdd
[error] are possible conversion functions from source.type to ?{def self: ?}
[error] } yield (source.self, buildMigration(source, targets))
[error] ^
[error] /vagrant/src/main/scala/com.snowplowanalytics/schemaguru/Migrations.scala:211: value self is not a member of com.snowplowanalytics.schemaguru.Schema
[error] } yield (source.self, buildMigration(source, targets))
[error] ^
[error] /vagrant/src/main/scala/com.snowplowanalytics/schemaguru/Migrations.scala:212: value _1 is not a member of Nothing
[error] migrations.groupBy(_._1)
[error] ^
[error] /vagrant/src/main/scala/com.snowplowanalytics/schemaguru/SchemaCodecs.scala:20: imported `Schema' is permanently hidden by definition of object Schema in package schemaguru
[error] import Common.{ Schema, SchemaDescription, SchemaVer }
[error] ^
[error] /vagrant/src/main/scala/com.snowplowanalytics/schemaguru/SchemaCodecs.scala:60: type mismatch;
[error] found : com.snowplowanalytics.schemaguru.Common.SchemaDescription
[error] required: com.snowplowanalytics.schemaguru.schema.JsonSchema
[error] Schema(desc, JObject(cleanSchema))
[error] ^
[error] /vagrant/src/main/scala/com.snowplowanalytics/schemaguru/SchemaCodecs.scala:66: type mismatch;
[error] found : x.type (with underlying type com.snowplowanalytics.schemaguru.Schema)
[error] required: ?{def self: ?}
[error] Note that implicit conversions are not applicable because they are ambiguous:
[error] both method any2stringfmt in object Predef of type (x: Any)scala.runtime.StringFormat
[error] and method any2stringadd in object Predef of type (x: Any)scala.runtime.StringAdd
[error] are possible conversion functions from x.type to ?{def self: ?}
[error] case x: Schema => JObject(JField("self", Extraction.decompose(x.self)) :: x.data.obj)
[error] ^
[error] /vagrant/src/main/scala/com.snowplowanalytics/schemaguru/SchemaGuru.scala:23: imported `JsonConvertResult' is permanently hidden by definition of object JsonConvertResult in package schemaguru
[error] import Common.{ JsonConvertResult, DerivedSchema, SchemaGuruResult }
[error] ^
[error] /vagrant/src/main/scala/com.snowplowanalytics/schemaguru/SchemaGuru.scala:23: imported `SchemaGuruResult' is permanently hidden by definition of object SchemaGuruResult in package schemaguru
[error] import Common.{ JsonConvertResult, DerivedSchema, SchemaGuruResult }
[error] ^
[error] /vagrant/src/main/scala/com.snowplowanalytics/schemaguru/SchemaGuru.scala:83: type mismatch;
[error] found : com.snowplowanalytics.schemaguru.Common.DerivedSchema
[error] required: com.snowplowanalytics.schemaguru.schema.JsonSchema
[error] SchemaGuruResult(DerivedSchema(schema, None), jsonConvertResult.errors, Some(PossibleDuplicatesWarning(duplicates)))
[error] ^
[error] /vagrant/src/main/scala/com.snowplowanalytics/schemaguru/cli/DdlCommand.scala:79: type mismatch;
[error] found : List[com.snowplowanalytics.schemaguru.JsonFile]
[error] required: List[com.snowplowanalytics.schemaguru.Common.JsonFile]
[error] if (rawMode) transformRaw(someJsons)
[error] ^
[error] /vagrant/src/main/scala/com.snowplowanalytics/schemaguru/cli/DdlCommand.scala:80: type mismatch;
[error] found : List[com.snowplowanalytics.schemaguru.JsonFile]
[error] required: List[com.snowplowanalytics.schemaguru.Common.JsonFile]
[error] else transformSelfDescribing(someJsons)
[error] ^
[error] /vagrant/src/main/scala/com.snowplowanalytics/schemaguru/cli/DdlCommand.scala:105: type mismatch;
[error] found : List[com.snowplowanalytics.schemaguru.Common.Schema]
[error] required: List[com.snowplowanalytics.schemaguru.Schema]
[error] val migrationMap = Migrations.buildMigrationMap(schemas)
[error] ^
[error] /vagrant/src/main/scala/com.snowplowanalytics/schemaguru/cli/SchemaCommand.scala:148: value describe is not a member of com.snowplowanalytics.schemaguru.SchemaGuruResult
[error] possible cause: maybe a semicolon is missing before `value describe'?
[error] .describe(vendor, name, schemaver)
[error] ^
[error] /vagrant/src/main/scala/com.snowplowanalytics/schemaguru/utils/FileSystemJsonGetters.scala:130: type mismatch;
[error] found : List[scalaz.Validation[String,com.snowplowanalytics.schemaguru.Common.JsonFile]]
[error] required: com.snowplowanalytics.schemaguru.ValidJsonFileList
[error] (which expands to) List[scalaz.Validation[String,com.snowplowanalytics.schemaguru.JsonFile]]
[error] else if (file.isDirectory) for { f <- listAllFiles(file) } yield getJsonFile(f)
[error] ^
[error] /vagrant/src/main/scala/com.snowplowanalytics/schemaguru/utils/FileSystemJsonGetters.scala:131: type mismatch;
[error] found : scalaz.Validation[String,com.snowplowanalytics.schemaguru.Common.JsonFile]
[error] required: scalaz.Validation[String,com.snowplowanalytics.schemaguru.JsonFile]
[error] else getJsonFile(file) :: Nil
[error] ^
[error] 22 errors found
[error] (schema-guru/compile:compileIncremental) Compilation failed
Weird. I had similar errors when I compiled it on my local machine and then tried to recompile in the Vagrant VM without cleaning the target dir.
I'll try cleaning the target dir
Also, is it possible that you have some unmodified files from the old version? It tries to find com.snowplowanalytics.schemaguru.Schema, which was previously defined in package.scala but has now moved to the Common module.
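The "permanently hidden" errors come from a name clash: a definition in the enclosing scope takes precedence over a same-named import, so the unqualified name silently refers to the wrong thing. A minimal sketch of the shape (the names are illustrative, not the real Schema Guru code); fully qualifying the reference sidesteps the clash:

```scala
object Common {
  case class Schema(version: Int)   // the type the import is meant to bring in
}

object schemaguru {
  import Common.Schema              // shadowed by the definition below:
  object Schema                     // triggers "imported `Schema' is permanently hidden"

  // Unqualified `Schema` now means the local object, so code expecting
  // Common.Schema's members breaks; fully qualifying avoids the ambiguity.
  def make(v: Int): Common.Schema = Common.Schema(v)
}
```

Stale .class files in target/ from the old package.scala layout make the same symptom appear even after the source is fixed, which is why cleaning the target dir matters here.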
Cleaning the target dir fixed it. The next error:
java.io.IOException: Cannot run program "gulp" (in directory "webui/src/main/resources/web"): error=2, No such file or directory
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
at scala.sys.process.ProcessBuilderImpl$Simple.run(ProcessBuilderImpl.scala:68)
at scala.sys.process.ProcessBuilderImpl$AbstractBuilder.$bang(ProcessBuilderImpl.scala:112)
at scala.sys.process.ProcessBuilderImpl$AbstractBuilder.slurp(ProcessBuilderImpl.scala:128)
at scala.sys.process.ProcessBuilderImpl$AbstractBuilder.$bang$bang(ProcessBuilderImpl.scala:101)
at WebuiBuildSettings$$anonfun$1.apply$mcV$sp(WebuiBuildSettings.scala:60)
at WebuiBuildSettings$$anonfun$1.apply(WebuiBuildSettings.scala:58)
at WebuiBuildSettings$$anonfun$1.apply(WebuiBuildSettings.scala:58)
at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:44)
at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:44)
at sbt.std.Transform$$anon$4.work(System.scala:63)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
at sbt.Execute.work(Execute.scala:237)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: error=2, No such file or directory
at java.lang.UNIXProcess.forkAndExec(Native Method)
at java.lang.UNIXProcess.<init>(UNIXProcess.java:187)
at java.lang.ProcessImpl.start(ProcessImpl.java:130)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
at scala.sys.process.ProcessBuilderImpl$Simple.run(ProcessBuilderImpl.scala:68)
at scala.sys.process.ProcessBuilderImpl$AbstractBuilder.$bang(ProcessBuilderImpl.scala:112)
at scala.sys.process.ProcessBuilderImpl$AbstractBuilder.slurp(ProcessBuilderImpl.scala:128)
at scala.sys.process.ProcessBuilderImpl$AbstractBuilder.$bang$bang(ProcessBuilderImpl.scala:101)
at WebuiBuildSettings$$anonfun$1.apply$mcV$sp(WebuiBuildSettings.scala:60)
at WebuiBuildSettings$$anonfun$1.apply(WebuiBuildSettings.scala:58)
at WebuiBuildSettings$$anonfun$1.apply(WebuiBuildSettings.scala:58)
at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:44)
at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:44)
at sbt.std.Transform$$anon$4.work(System.scala:63)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
at sbt.Execute.work(Execute.scala:237)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
At which exact step does it fail? assemble_fatjar or
webui? Are you sure all playbooks were executed? Do you have the gulp command in the Vagrant box?
Yep, assemble_fatjar webui. I have destroyed the Vagrant environment and am bringing it up again.
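The stack trace above shows the build shelling out via scala.sys.process, and the IOException (error=2) just means the gulp binary is not on the PATH inside the VM. A small sketch of a pre-flight check (the helper name is made up, not part of the actual WebuiBuildSettings):

```scala
import scala.sys.process._

// Check that an external tool (e.g. gulp) is resolvable on the PATH before
// the build shells out to it, so the failure is an explicit message instead
// of "java.io.IOException: error=2, No such file or directory".
def toolAvailable(name: String): Boolean =
  Seq("sh", "-c", s"command -v $name >/dev/null 2>&1").! == 0

// e.g.: require(toolAvailable("gulp"), "gulp not found; install it via npm")
```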
Everything's been pushed now!
@chuwy the release is working well for me - just waiting on the blog post and @fblundun's code review...
@chuwy it looks good to me - I don't have any suggestions besides the file overwriting thing.
New features
--force flag (#141). Without it, Schema Guru will not override any table definitions or migrations if there were any manual changes (ignoring comments and whitespace).
Internal changes
The input argument is no longer required to be last; options can now follow it (run schema --schema-by "$.path" {{input}} --enum 3). As a downside, the Spark job now receives just a flag --enum-sets instead of multiple options (https://github.com/scopt/scopt/issues/103).
Blog post: https://github.com/snowplow/snowplow.github.com/pull/292