While TensorFlow models are typically defined and trained using R or Python code, it is possible to deploy them in a wide variety of environments without any runtime dependency on R or Python (a brief deployment sketch follows the list below):
- TensorFlow Serving is an open-source software library for serving TensorFlow models using a gRPC interface.
- CloudML is a managed cloud service that serves TensorFlow models using a REST interface.
- RStudio Connect provides support for serving models using the same REST API as CloudML, but on a server within your own organization.
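The common pattern behind all three targets is to first export the trained model as a TensorFlow SavedModel and then hand that directory to the deployment tool. The sketch below illustrates this under some assumptions: `model` stands in for a model you have already trained, the directory and deployment names are placeholders, and the exact arguments accepted by `cloudml::cloudml_deploy()` and `rsconnect::deployTFModel()` are described in their respective package documentation.

```r
library(keras)

# Export a trained Keras model as a TensorFlow SavedModel
# ('model' is a placeholder for a model you have already defined and trained)
export_savedmodel(model, "savedmodel")

# Deploy the exported model to CloudML (illustrative name)
# cloudml::cloudml_deploy("savedmodel", name = "my_model")

# Or publish it to RStudio Connect (illustrative; requires a configured account)
# rsconnect::deployTFModel("savedmodel")

# For TensorFlow Serving, point the server at the export, e.g.:
#   tensorflow_model_server --port=8500 --model_name=my_model \
#     --model_base_path=/path/to/models/my_model
# (TensorFlow Serving expects versioned SavedModel subdirectories under
#  the base path.)
```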
TensorFlow models can also be deployed to mobile and embedded devices, including iOS and Android mobile phones and Raspberry Pi computers.

The tfdeploy package includes a variety of tools designed to make exporting and serving TensorFlow models straightforward. For documentation on using tfdeploy, see the package website at https://tensorflow.rstudio.com/tools/tfdeploy/.
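As a minimal sketch of the tfdeploy workflow, assuming a model has already been exported to a "savedmodel" directory (as in the earlier sketch), `serve_savedmodel()` starts a local REST endpoint and `predict_savedmodel()` generates predictions directly from the export. The 784-element input below is only a placeholder and must match your model's input signature.

```r
library(tfdeploy)

# Serve a previously exported SavedModel over a local REST API
# (the same HTTP interface used by CloudML and RStudio Connect)
serve_savedmodel("savedmodel")

# Or generate predictions from the exported model without starting a server
# (placeholder input: a single instance with 784 features)
predict_savedmodel(list(rep(0, 784)), "savedmodel")
```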