catmando opened 7 years ago
Some initial thoughts:
Use the policies to determine which models are actually accessible on the client. For those models, use ActiveRecord introspection methods to get relationship and attribute information.
This means that for basic use, all you have to do is define the policy granting access (which you should do anyway).
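A sketch of what that introspection could look like. `client_model_info` is a hypothetical helper; `columns_hash` and `reflect_on_all_associations` are ActiveRecord's real introspection calls, and `FakeModel` below merely mimics their shape so the sketch runs stand-alone:

```ruby
# Hypothetical helper: gather the attribute and relationship information
# the client needs for a policy-exposed model.
def client_model_info(model)
  {
    attributes:    model.columns_hash.transform_values(&:type),
    relationships: model.reflect_on_all_associations.map { |r| [r.name, r.macro] }.to_h
  }
end

# Stand-ins mimicking ActiveRecord's column and reflection objects.
Column     = Struct.new(:type)
Reflection = Struct.new(:name, :macro)

class FakeModel
  def self.columns_hash
    { "id" => Column.new(:integer), "title" => Column.new(:string) }
  end

  def self.reflect_on_all_associations
    [Reflection.new(:comments, :has_many)]
  end
end
```

With a real model class, the same two calls would yield everything needed to generate the client-side model stubs.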
Add a new policy method that regulates remote access to class and instance methods.
Again, this means that to remotely access a method on the server (from the client), it just needs to be whitelisted in a policy.
```ruby
class FooPolicy
  # allow Foo.some_method to be called remotely if there is a logged in user
  allow_rpc_to(:some_method) { acting_user }
  # allow Foo.other_method to be called if the first param == acting_user's id
  allow_rpc_to(:other_method) { |p1| acting_user.id == p1 }
end
```
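A minimal sketch of how such a policy could be enforced on the server. `allow_rpc_to` and `rpc_allowed?` are hypothetical names, and the `User` struct is a stand-in for a real user record:

```ruby
# Hypothetical enforcement for the policy DSL: allow_rpc_to registers a
# guard block per method name, and the server checks rpc_allowed? before
# dispatching a remote call.
class BasePolicy
  def self.rpc_guards
    @rpc_guards ||= {}
  end

  def self.allow_rpc_to(name, &guard)
    rpc_guards[name] = guard
  end

  attr_reader :acting_user

  def initialize(acting_user)
    @acting_user = acting_user
  end

  # True only when a guard was whitelisted for the method and it passes.
  def rpc_allowed?(name, *params)
    guard = self.class.rpc_guards[name]
    !!(guard && instance_exec(*params, &guard))
  end
end

User = Struct.new(:id)

class FooPolicy < BasePolicy
  allow_rpc_to(:some_method)  { acting_user }
  allow_rpc_to(:other_method) { |p1| acting_user.id == p1 }
end
```

The key property is that anything not explicitly whitelisted is denied by default.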
On the client we can then say:

```ruby
Foo.some_method
```

and some_method will return a promise, or take a block that is executed when the call returns to the client!
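A sketch of how that dual interface could behave. `TinyPromise` is a deliberately tiny, synchronous stand-in for Opal's real asynchronous Promise, and `rpc` is a hypothetical transport call:

```ruby
# Toy promise: resolves synchronously, just enough to show the shape of
# the client-side API.
class TinyPromise
  def initialize(value)
    @value = value
  end

  def then
    yield @value
    self
  end
end

# Hypothetical transport: in hypermesh this would be an HTTP call to the
# server, resolved asynchronously; here it resolves immediately.
def rpc(klass, method_name)
  TinyPromise.new("#{klass}.#{method_name} result")
end

# The stub the client would see for a whitelisted method: it returns a
# promise, or runs the given block when the call "returns".
class Foo
  def self.some_method(&block)
    promise = rpc("Foo", :some_method)
    block ? promise.then(&block) : promise
  end
end
```

So `Foo.some_method { |r| ... }` and `Foo.some_method.then { |r| ... }` end up equivalent.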
And here is an additional set of mechanisms that could also be used. This is pretty magic, but it should be doable: add the methods server_method and client_method to all classes (yep). Both take a name argument and a block.

server_method creates a class method callable from the client. The class instance variable acting_user is set to the current acting user for the duration of the block (so it's up to the server_method block to check the value and raise an error if necessary).

client_method is more fun: its block is recompiled by Opal and sent to the client with the rest of the client code (like I said, I think this can be done; similar things are done by the hypermesh rspec test helpers).
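The server_method half could be sketched roughly like this. The module name, `with_acting_user`, and the ivar-based user hand-off are all invented for illustration; the real mechanism would be tied to the request cycle:

```ruby
# Hypothetical mixin: server_method defines a class method whose block is
# run with self = the class, so it sees acting_user and other class state.
module ServerMethods
  def server_method(name, &block)
    define_singleton_method(name) do |*args|
      instance_exec(*args, &block)
    end
  end

  def acting_user
    @acting_user
  end

  # Stand-in for the server setting the user for the duration of a request.
  def with_acting_user(user)
    @acting_user = user
    yield
  ensure
    @acting_user = nil
  end
end

class Report
  extend ServerMethods

  server_method :title_for do |id|
    raise "no user logged in" unless acting_user
    "report #{id} (requested by #{acting_user})"
  end
end
```

Note that, as described above, the guard lives inside the block: the framework only sets acting_user, and the block decides whether to raise.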
The question is, would the magic of client_method be worth it? With just server_method, the models/public directory, and if RUBY_ENGINE directives, you can accomplish the same thing. What client_method would add is being able to put the code in any directory and get rid of the clumsy (but very simple to understand) if RUBY_ENGINE directives...

Here is a real world example (donated by @adamcreekroad from his change-requests app), modified to use server_method and client_method:
```ruby
class FileUploader
  SECONDS_TO_UPLOAD = 300

  # the entry point called on the client to get a new direct upload file handle
  server_method :create_cloud_temp_url do |file_name|
    raise "no user logged in" unless acting_user
    uuid = "#{generate_uuid}#{File.extname(file_name)}"
    temp_directory.files.create(key: uuid, access_control_allow_origin: '*',
                                metadata: { original_file_name: file_name })
    url = client.get_object_https_url(config[:temp_directory], uuid,
                                      Time.now.to_i + SECONDS_TO_UPLOAD, method: 'PUT')
    { uuid: uuid, upload_url: url }
  end

  # after you get the file handle update the ui, then begin the upload directly to the cloud
  client_method :upload_to_cloud do |file, data, update_prog|
    xhr = -> { xhr_progress_handler(update_prog) }
    HTTP.put(data[:upload_url], data: file, cache: false, contentType: false,
             processData: false, xhr: xhr) do |_response|
      copy_from_temp_dir(data[:uuid]) { |url| yield url }
    end
  end

  # private client method to call update_prog with the computed progress
  client_method :xhr_progress_handler do |update_prog|
    xhr = `new XMLHttpRequest();`
    update_progress = lambda do |evt|
      compute_length = `#{evt}.lengthComputable`
      return unless compute_length
      update_prog.call((`#{evt}.loaded / #{evt}.total` * 100).round(2))
    end
    `#{xhr}.upload.addEventListener("progress", #{update_progress}, false);`
    xhr
  end

  # called by upload_to_cloud when the transfer to the cloud is completed.
  server_method :copy_from_temp_dir do |file_name|
    raise "no user logged in" unless acting_user
    orig_file = temp_file(file_name)
    client.copy_object(config[:temp_directory], file_name, config[:directory], file_name,
                       'Content-Type' => orig_file.content_type)
    orig_file.destroy
    { url: "#{config[:https_cdn]}/#{file_name}" }
  end

  # private server details
  class << self
    def client
      @client ||= Fog::Storage.new(credentials)
    end

    def temp_directory
      client.directories.get(config[:temp_directory])
    end

    def directory
      client.directories.get(config[:directory])
    end

    def temp_file(uuid)
      temp_directory.files.get(uuid)
    end

    def file(uuid)
      directory.files.get(uuid)
    end

    def generate_uuid
      uuid = nil
      loop do
        uuid = SecureRandom.hex
        break unless file(uuid)
      end
      uuid
    end

    def config
      YAML.load_file("#{Rails.root}/config/fog.yml")[Rails.env].symbolize_keys
    end

    def credentials
      YAML.load_file("#{Rails.root}/config/fog.yml")[Rails.env]['credentials'].symbolize_keys
    end
  end
end
```
The other possibility, which would work if we just had the allow_rpc_to method, would be to structure the code into a "public" client-side file like this:
```ruby
# app/models/public/file_uploader.rb
class FileUploader
  unless RUBY_ENGINE == 'opal'
    include HyperMesh::PolicyMethods
    extend FileUploaderServerMethods
    allow_rpc_access_to(:create_cloud_temp_url) { |file_name| acting_user }
    allow_rpc_access_to(:copy_from_temp_dir)    { |file_name| acting_user }
  end

  def self.upload_to_cloud(file, data, update_prog)
    xhr = -> { xhr_progress_handler(update_prog) }
    HTTP.put(data[:upload_url], data: file, cache: false, contentType: false,
             processData: false, xhr: xhr) do |_response|
      copy_from_temp_dir(data[:uuid]) { |url| yield url }
    end
  end

  # private client method to call update_prog with the computed progress
  def self.xhr_progress_handler(update_prog)
    xhr = `new XMLHttpRequest();`
    update_progress = lambda do |evt|
      compute_length = `#{evt}.lengthComputable`
      return unless compute_length
      update_prog.call((`#{evt}.loaded / #{evt}.total` * 100).round(2))
    end
    `#{xhr}.upload.addEventListener("progress", #{update_progress}, false);`
    xhr
  end
end
```
```ruby
# app/models/file_uploader_server_methods.rb
module FileUploaderServerMethods
  SECONDS_TO_UPLOAD = 300

  def create_cloud_temp_url(file_name)
    raise "no user logged in" unless acting_user
    uuid = "#{generate_uuid}#{File.extname(file_name)}"
    temp_directory.files.create(key: uuid, access_control_allow_origin: '*',
                                metadata: { original_file_name: file_name })
    url = client.get_object_https_url(config[:temp_directory], uuid,
                                      Time.now.to_i + SECONDS_TO_UPLOAD, method: 'PUT')
    { uuid: uuid, upload_url: url }
  end

  # called by upload_to_cloud when the transfer to the cloud is completed.
  def copy_from_temp_dir(file_name)
    raise "no user logged in" unless acting_user
    orig_file = temp_file(file_name)
    client.copy_object(config[:temp_directory], file_name, config[:directory], file_name,
                       'Content-Type' => orig_file.content_type)
    orig_file.destroy
    { url: "#{config[:https_cdn]}/#{file_name}" }
  end

  # private server details
  def client
    @client ||= Fog::Storage.new(credentials)
  end

  def temp_directory
    client.directories.get(config[:temp_directory])
  end

  def directory
    client.directories.get(config[:directory])
  end

  def temp_file(uuid)
    temp_directory.files.get(uuid)
  end

  def file(uuid)
    directory.files.get(uuid)
  end

  def generate_uuid
    uuid = nil
    loop do
      uuid = SecureRandom.hex
      break unless file(uuid)
    end
    uuid
  end

  def config
    YAML.load_file("#{Rails.root}/config/fog.yml")[Rails.env].symbolize_keys
  end

  def credentials
    YAML.load_file("#{Rails.root}/config/fog.yml")[Rails.env]['credentials'].symbolize_keys
  end
end
```
This opening block could be simplified using some convention over configuration:

```ruby
class FileUploader
  extend_on_server do
    allow_rpc_access_to(:create_cloud_temp_url) { |file_name| acting_user }
    allow_rpc_access_to(:copy_from_temp_dir)    { |file_name| acting_user }
  end

  # ... rest of client methods
end
```
extend_on_server would do the include of the policy methods, and would find the app/models/file_uploader.rb file and require it.
The advantage is that now you can have a server-side class file (even for ActiveRecord models) that you do not have to touch. If you need client-specific extensions, you can put them in a second app/public file of the same name and class.
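A minimal sketch of how extend_on_server might wire this up. All names are hypothetical; a real version would require the app/models server file rather than assume its module is already loaded, and the one-method module here is just a stand-in:

```ruby
# Hypothetical convention-over-configuration helper: on the server, extend
# the class with its "<Name>ServerMethods" module and run the block (which
# would hold the allow_rpc_* declarations); on the client (Opal), do nothing.
class Class
  def extend_on_server(&block)
    return if RUBY_ENGINE == 'opal'
    extend Object.const_get("#{name}ServerMethods")
    instance_exec(&block) if block
  end
end

# Stand-in for the real server-methods module required by convention.
module FileUploaderServerMethods
  def create_cloud_temp_url(file_name)
    "https://example.test/upload/#{file_name}"
  end
end

class FileUploader
  extend_on_server do
    # allow_rpc_access_to declarations would go here
  end
end
```

The convention (class name plus a ServerMethods suffix) is what lets the client file stay free of any server references.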
I think I like `allow_clients_to_call :create_cloud_temp_url` better as a name. Any other ideas?
@noma4i FYI I don't think it's possible to simply do this in the components, UNLESS the component files are compiled on the server. Otherwise it's a security risk: somewhere you have to have server-side code that defines the access rights.
I also know you were getting a ton of issues when moving your models to public... hopefully this will be fixed in a point release coming up shortly.
I prefer to have just one class in an isomorphic Rails application, and to be able to tell directly, with a look at that class, which methods are client- or server-only.
> The question is, would the magic of client_method be worth it? With just server_method and using the models/public directory plus if RUBY_ENGINE directives you can accomplish the same thing.
I cannot tell with confidence which of the solutions I prefer, because I am just starting to use hyper-mesh and ReactRB. But to me as a beginner, the simpler solution using if RUBY_ENGINE seems less confusing at the moment, and if it does the job, I am happy. The clumsiness of if RUBY_ENGINE could maybe be solved with helpers like in the following example?
```ruby
def client_only
  yield if RUBY_ENGINE == "opal"
end

def server_only
  yield unless RUBY_ENGINE == "opal"
end
```
then you can just do:

```ruby
class MyModel
  client_only do
    # your client-only code
  end

  server_only do
    # your server-only code
  end
end
```
I would say go with the easiest solution and put the complex solution aside. Once more people start using hyper-mesh, more edge cases will pop up and we will have a better picture of how we want to deal with this.
There is another question coming to my mind which might be related to this: how do we deal with ActiveSupport::Concern files in app/models/concerns?

The more I play with hyper-mesh, the more it feels like using ActiveRecord introspection to find out about relationships etc. is the better choice. Otherwise, most applications whose models use external gems for relationship definitions, like acts_as_list, won't work.
This is a discussion thread to talk about several related issues / requirements:

1) how to get rid of client-specific code from models, such as client-side scope specifications, server_method declarations, etc.
2) how to make sure hyper-mesh has access to model information built into gems (like https://github.com/mailboxer/mailboxer)
3) adding decorator (i.e. Draper-like) extensions to models
4) adding access to service objects and their methods
5) getting rid of the redundancy implied by having to define a policy to get access to a model, and then also having to move the model to the 'public' directory
6) if possible, not forcing the creation of additional files / classes unless desired by the developer for structuring purposes