MeltwaterArchive / dropwizard-extra

A set of miscellaneous and common Dropwizard utilities

Release of 0.7.0 Version #18

Open ramv opened 10 years ago

ramv commented 10 years ago

@nicktelford Can you please let me know when the new version of the plugin will be released?

Why was the Configuration subclass removed?

nicktelford commented 10 years ago

I'm working on it, but I've been a little busy.

Which Configuration sub-class are you referring to? And which modules do you use?

ramv commented 10 years ago

I am referring to KafkaConsumerConfiguration and KafkaProducerConfiguration. With the latest code I need to do

    public void run(Configuration configuration, Environment environment) throws Exception {

        //KafkaConsumerFactory kcf = new KafkaConsumerFactory();
        final Validator validator = Validation.buildDefaultValidatorFactory().getValidator();

        environment.jersey().register(new KafkaEnricherResource());

        // Build the consumer
        //TODO: Get the name (kafka-example.yml) from the config file
        KafkaConsumerFactory kcf = new ConfigurationFactory<>(KafkaConsumerFactory.class, validator,
                Jackson.newObjectMapper(),
                DROPWIZARD_PREFIX)
                .build(new File(System.getProperty(USER_DIR)
                        + File.separator
                        + "kafka-consumer.yml"));
        KafkaConsumerFactory.KafkaConsumerBuilder kcb = kcf.processWith(new KafkaStreamProcessor());
        //TODO: Get the name from the config file
        KafkaConsumer consumer = kcb.build(environment, "my-test-consumer");
    }

With 0.6.2 I was able to do

    public void run(KafkaConsumerConfiguration configuration, Environment environment) throws Exception {
        KafkaConsumerFactory kcf = new KafkaConsumerFactory(environment);
        KafkaConsumerFactory.KafkaConsumerBuilder kcb = kcf.processWith(new KafkaStreamProcessor());
        KafkaConsumer consumer = kcb.build(configuration, "my-test-consumer");
    }

The advantage here is that I can use a single configuration file for storing all information. With the new approach, I need to separate the server configuration from the consumer and producer configuration.

Any reason why this deviation from the Dropwizard convention was introduced?

ramv commented 10 years ago

> And which modules do you use?

I am using dropwizard-extra-kafka

nicktelford commented 10 years ago

I see. I'm curious as to why you keep your Kafka consumer configuration outside of your main application config?

The reason this changed is to bring it in line with the new idioms used in Dropwizard 0.7. You'll see a similar idiom being used in most of the upstream "Factory" classes.

ramv commented 10 years ago

I can't put the Kafka configuration along with my application configuration because KafkaConsumerFactory is not a subclass of Configuration. So when I declare public class KafkaService extends Application<KafkaConsumerFactory> I get a compile error.


nicktelford commented 10 years ago

That's not what I meant. In Dropwizard, it's idiomatic to have a single Configuration sub-class that composes the various things you need:

    public class MyConfiguration extends Configuration {
        private KafkaConsumerFactory kafka = new KafkaConsumerFactory();

        @JsonProperty("kafka")
        public KafkaConsumerFactory getKafkaConsumerFactory() {
            return kafka;
        }

        @JsonProperty("kafka")
        public void setKafkaConsumerFactory(KafkaConsumerFactory factory) {
            this.kafka = factory;
        }
    }

    public class MyApplication extends Application<MyConfiguration> {
        @Override
        public void run(MyConfiguration configuration, Environment environment) {
            StreamProcessor myProcessor = ...; // your StreamProcessor implementation

            // build the KafkaConsumer from the factory composed into the configuration
            KafkaConsumer consumer = configuration.getKafkaConsumerFactory()
                    .processWith(myProcessor)
                    .build(environment);
        }
    }

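Everything then lives in the one application config file. A rough sketch of what that YAML could look like (only the "kafka" key comes from the @JsonProperty above; the nested keys depend on what KafkaConsumerFactory actually exposes, so they're just placeholders here):

    # hypothetical single application config
    kafka:
      # ... KafkaConsumerFactory settings (consumer group, topics, ZooKeeper connection, etc.) ...

    server:
      # ... the usual Dropwizard server settings ...
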
nicktelford commented 10 years ago

Also, FYI, the dropwizard-extra-kafka module will only build against Kafka 0.8.1.1; so if you're using Kafka 0.7 you will need to fork it and port it yourself.

mjwillson commented 10 years ago

Hiya -- also wondering if there's any chance of a release of the 0.7.1 version?

I'd like to try the project out without being stuck on Dropwizard 0.6...

nicktelford commented 10 years ago

So the major blocker atm is a lack of testing, though you'll notice that master is now built against Dropwizard 0.7.

I'm building a number of applications off master atm, and once I have them tested and any bugs ironed out I'll cut a release for the 0.7.x series.

olvesh commented 9 years ago

Also waiting for a release of the 0.7.x-compatible extras - what is holding it back? Anything you want tested? I am looking into using the Kafka modules at the moment.

kvanvranken commented 9 years ago

I am waiting on the newer version of the Kafka bundle for Kafka 0.8.1.1. I'm assuming that since it's in the same dev branch, it will be released with the rest of the 0.7.x release?

Thanks!

ramv commented 9 years ago

@nicktelford how can I help test the code so that we can get a release pushed out?

ramv commented 9 years ago

@nicktelford can you please let us know when you can push a release out? What can I do to help?

ramv commented 9 years ago

@nicktelford can you please respond? What can I do to help you push a release out?