Open · retronym opened this issue 6 years ago
Interesting, thanks for the info! I don't think anybody has ever looked at optimizing the code - if you have any data on what specifically takes so long, that would be helpful.
Okay, I've gotten past my profiling problems. I've normalized the synthetic Lambda$NNNN class names, which make profiles harder to read. Here's what I came out with:
$ cat /tmp/profile.csv | grep wire_impl | perl -pne 's/.*wire_impl/wire_impl/g' | perl -pne 's/\$\$Lambda\$\d+\/\d+/LambdaX.X/g' | /code/FlameGraph/flamegraph.pl --minwidth 2 --width 2000 - > /tmp/flame.svg
https://drive.google.com/drive/folders/1T5cWwKljhDuoyqmvovdb3CllCz1Dfkrm?usp=sharing
That still shows that most of the time is spent in one-off costs of classloading and linking the macro implementation. Maybe because the implementation is quite lambda-heavy, MacWire pays a bigger cost than other macros?
We're slowly getting towards a situation where macro classloaders could be reused across later compilation runs.
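Roughly, that reuse could look like the sketch below (hypothetical names, not the compiler's actual mechanism): keep one classloader per macro classpath so later runs skip the classloading and linking cost.

import java.net.{URL, URLClassLoader}
import java.nio.file.Paths
import scala.collection.concurrent.TrieMap

// Hypothetical sketch: cache macro classloaders keyed by classpath so a
// later compilation run can reuse already-loaded classes instead of
// paying the one-off classloading/linking cost again.
object MacroClassLoaderCache {
  private val loaders = TrieMap.empty[Seq[String], URLClassLoader]

  def loaderFor(classpath: Seq[String]): URLClassLoader =
    loaders.getOrElseUpdate(classpath, {
      val urls: Array[URL] = classpath.map(p => Paths.get(p).toUri.toURL).toArray
      new URLClassLoader(urls, getClass.getClassLoader)
    })
}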
This could be changed to use .declarations; constructors are not inherited, and calling .members needs to do more work.
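To illustrate the distinction, here's a minimal sketch using scala-reflect (.declarations is spelled .decls in current versions; the helper name is invented):

import scala.reflect.runtime.universe._

// Constructors are declared directly on a class and never inherited, so
// scanning `decls` (the class's own declarations) is enough; `members`
// also walks every parent in the linearization, doing extra work.
def primaryConstructor(tpe: Type): Option[MethodSymbol] =
  tpe.decls.collectFirst {
    case m: MethodSymbol if m.isPrimaryConstructor => m
  }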
If I'm reading things correctly, I think the problem is that for each call to the wire macro, EligibleValuesFinder will typecheck duplicates of the DefTrees in the body of the enclosing template. That sounds like O(N^2) complexity to me.
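To make the shape of the problem concrete, here is a minimal sketch (hypothetical classes, assuming MacWire is on the classpath): with N wire calls next to N sibling definitions, each expansion re-typechecks the others, so the work grows roughly as N^2.

import com.softwaremill.macwire._

class A
class B(a: A)
class C(a: A, b: B)

// Each wire[...] expansion below scans and typechecks the sibling
// definitions of the enclosing trait, so N call sites each look at
// N members: roughly N^2 typechecks as the trait grows.
trait Components {
  lazy val a: A = wire[A]
  lazy val b: B = wire[B]
  lazy val c: C = wire[C]
}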
In guardian/frontend, a typical wiring looks like:
import akka.actor.ActorSystem
import app.{FrontendApplicationLoader, FrontendComponents}
import assets.DiscussionExternalAssetsLifecycle
import com.softwaremill.macwire._
import common.dfp.DfpAgentLifecycle
import common.{ApplicationMetrics, CloudWatchMetricsLifecycle, ContentApiMetrics, EmailSubsciptionMetrics}
import _root_.commercial.targeting.TargetingLifecycle
import common.Logback.{LogbackOperationsPool, LogstashLifecycle}
import common.commercial.OrielCacheLifecycle
import conf.CachedHealthCheckLifeCycle
import conf.switches.SwitchboardLifecycle
import contentapi.{CapiHttpClient, ContentApiClient, HttpClient, SectionsLookUp, SectionsLookUpLifecycle}
import controllers._
import dev.{DevAssetsController, DevParametersHttpRequestHandler}
import http.{CommonFilters, CorsHttpErrorHandler}
import jobs.{SiteMapJob, SiteMapLifecycle}
import model.{ApplicationContext, ApplicationIdentity}
import services.ophan.SurgingContentAgentLifecycle
import play.api.ApplicationLoader.Context
import play.api.BuiltInComponentsFromContext
import play.api.http.{HttpErrorHandler, HttpRequestHandler}
import play.api.libs.ws.WSClient
import play.api.mvc.EssentialFilter
import play.api.routing.Router
import services._
import router.Routes
import scala.concurrent.ExecutionContext
class AppLoader extends FrontendApplicationLoader {
  override def buildComponents(context: Context): FrontendComponents = new BuiltInComponentsFromContext(context) with AppComponents
}

trait ApplicationsServices {
  def wsClient: WSClient
  implicit val executionContext: ExecutionContext

  lazy val capiHttpClient: HttpClient = wire[CapiHttpClient]
  lazy val contentApiClient = wire[ContentApiClient]
  lazy val siteMapJob = wire[SiteMapJob]
  lazy val sectionsLookUp = wire[SectionsLookUp]
  lazy val ophanApi = wire[OphanApi]
  lazy val facebookGraphApiClient = wire[FacebookGraphApiClient]
  lazy val facebookGraphApi = wire[FacebookGraphApi]
}

trait AppComponents extends FrontendComponents with ApplicationsControllers with ApplicationsServices {
  lazy val devAssetsController = wire[DevAssetsController]
  lazy val healthCheck = wire[HealthCheck]
  lazy val emailSignupController = wire[EmailSignupController]
  lazy val surveyPageController = wire[SurveyPageController]
  lazy val signupPageController = wire[SignupPageController]
  lazy val logbackOperationsPool = wire[LogbackOperationsPool]

  override lazy val lifecycleComponents = List(
    wire[LogstashLifecycle],
    wire[ConfigAgentLifecycle],
    wire[CloudWatchMetricsLifecycle],
    wire[DfpAgentLifecycle],
    wire[SurgingContentAgentLifecycle],
    wire[IndexListingsLifecycle],
    wire[SectionsLookUpLifecycle],
    wire[SwitchboardLifecycle],
    wire[SiteMapLifecycle],
    wire[CachedHealthCheckLifeCycle],
    wire[TargetingLifecycle],
    wire[DiscussionExternalAssetsLifecycle],
    wire[SkimLinksCacheLifeCycle],
    wire[OrielCacheLifecycle]
  )

  lazy val router: Router = wire[Routes]

  lazy val appIdentity = ApplicationIdentity("applications")

  override lazy val appMetrics = ApplicationMetrics(
    ContentApiMetrics.HttpTimeoutCountMetric,
    ContentApiMetrics.HttpLatencyTimingMetric,
    ContentApiMetrics.ContentApiErrorMetric,
    ContentApiMetrics.ContentApiRequestsMetric,
    EmailSubsciptionMetrics.EmailSubmission,
    EmailSubsciptionMetrics.EmailFormError,
    EmailSubsciptionMetrics.NotAccepted,
    EmailSubsciptionMetrics.APIHTTPError,
    EmailSubsciptionMetrics.APINetworkError,
    EmailSubsciptionMetrics.ListIDError,
    EmailSubsciptionMetrics.AllEmailSubmission
  )

  override lazy val httpErrorHandler: HttpErrorHandler = wire[CorsHttpErrorHandler]
  override lazy val httpFilters: Seq[EssentialFilter] = wire[CommonFilters].filters
  override lazy val httpRequestHandler: HttpRequestHandler = wire[DevParametersHttpRequestHandler]

  def actorSystem: ActorSystem
}
Maybe explicit type ascriptions on the members would avoid that typechecking?
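For instance, a hypothetical before/after on one of the members above; with an ascription, the member's type is available from its signature alone, so the macro would not need to typecheck the right-hand side to learn it:

// Without an ascription, the member's type must be inferred by
// typechecking its right-hand side (the wire[...] expansion):
lazy val contentApiClient = wire[ContentApiClient]

// With an ascription, the type can be read off the signature:
lazy val contentApiClient: ContentApiClient = wire[ContentApiClient]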
Hm, if the typechecks aren't cached by the compiler, that might indeed be O(n^2). There's always the option of introducing a Magnolia-style global cache that remembers previous typecheck results?
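A rough sketch of what such a cache could look like (all names hypothetical; Magnolia keeps a comparable per-run cache for derived instances):

import scala.collection.concurrent.TrieMap

// Hypothetical sketch: memoize the result of scanning a template once
// per compiler run, keyed by the enclosing owner's full name, so every
// later wire[...] in the same template reuses the scan instead of
// re-typechecking the sibling definitions.
object EligibleValuesCache {
  private val cache = TrieMap.empty[String, AnyRef]

  def getOrElseUpdate[A <: AnyRef](ownerFullName: String)(compute: => A): A =
    cache.getOrElseUpdate(ownerFullName, compute).asInstanceOf[A]
}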
I'm adding compilation time tracing to scalac to show how long each file/class/method takes to typecheck. The trace drills down further to measure time spent in implicit searches and in macro expansions.

I generated traces for a relatively large Scala build: https://github.com/guardian/frontend. This project uses MacWire for DI within a Play application.

The generated report suggests that 4% of total compilation time is spent in the wire macro. I'm having trouble getting a good look at this through Flight Recorder to make concrete suggestions about what could be optimized, but I thought I'd lodge this ticket as-is in the meantime.
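For reference, this kind of trace can be produced with scalac's built-in profiling options, assuming a compiler version recent enough to ship them (paths and sources below are placeholders):

$ scalac -Yprofile-enabled -Yprofile-trace /tmp/compile.trace Sources.scala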