
Issue search results · repo:aws/sagemaker-sparkml-serving-container language:Java

9 results

Hi, we're looking to use Spark ML pipelines for real-time inference, deploying in SageMaker. I assume this is our only option - is that correct? I can still see it referenced by the official SageMaker docs, e.g. ...
  • wandrar
  • Opened on Feb 25
  • #47
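
For context on #47 above: a minimal, hedged sketch of what real-time deployment of an MLeap-serialized SparkML pipeline looks like with the SageMaker Python SDK's SparkMLModel. The S3 path, IAM role, and schema columns are placeholders, not values from the issue.

```python
import json
from sagemaker.sparkml.model import SparkMLModel

# Placeholder schema describing the pipeline's input columns and output column.
schema = {
    "input": [
        {"name": "feature_1", "type": "double"},
        {"name": "feature_2", "type": "string"},
    ],
    "output": {"name": "prediction", "type": "double"},
}

sparkml_model = SparkMLModel(
    model_data="s3://my-bucket/sparkml/model.tar.gz",       # MLeap bundle packaged as model.tar.gz
    role="arn:aws:iam::111122223333:role/SageMakerRole",    # placeholder execution role
    env={"SAGEMAKER_SPARKML_SCHEMA": json.dumps(schema)},
)

# One-line deployment to a real-time endpoint backed by this container.
predictor = sparkml_model.deploy(initial_instance_count=1, instance_type="ml.m5.large")
```
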

Hi, I am trying to build the Docker image in this repository, and the build fails at Step 6/20: RUN apt -y install python3.6. Due to compliance and security restrictions at my organization, we cannot ...
  • mabel91
  • 1
  • Opened on Sep 16, 2021
  • #25

Hi! How can this container be used for batch transform in SageMaker? My env variable holds the schema for 561 columns, which is very long to pass as an env variable. I was following this tutorial: https://sagemaker-examples.readthedocs.io/en/latest/sagemaker-python-sdk/sparkml_serving_emr_mleap_abalone/sparkml_serving_emr_mleap_abalone.html ...
  • hafsahabib-educator
  • Opened on Nov 1, 2020
  • #18
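
Regarding #18 above: if a 561-column schema is too long for an environment variable, the schema can alternatively be carried inside each request payload. A hedged sketch, assuming a per-request format with top-level "schema" and "data" fields (field names are assumptions to verify against the repo README) and placeholder column names:

```python
import json

# Build the payload in code instead of squeezing the schema into an env variable.
schema = {
    "input": [{"name": f"col_{i}", "type": "double"} for i in range(561)],  # placeholder names
    "output": {"name": "prediction", "type": "double"},
}
payload = {
    "schema": schema,        # per-request schema (assumed field name; check the repo README)
    "data": [0.0] * 561,     # one row of feature values, in schema order
}
body = json.dumps(payload)   # sent with ContentType application/json
```
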

Hi, I was referring to this repo to build a custom Docker image that does the feature transformation and uses a LightGBM model for prediction. I wanted to use MLeap to serialize my feature transformation ...
  • prasadpande1990
  • Opened on Sep 15, 2020
  • #17
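
For #17 above, a short, hedged sketch of serializing a fitted Spark feature-transformation pipeline to the MLeap bundle format this container serves. The dataset, pipeline stages, and output path are placeholders, and the LightGBM half of the author's setup is not shown.

```python
import mleap.pyspark                                            # noqa: F401  registers serializeToBundle
from mleap.pyspark.spark_support import SimpleSparkSerializer   # noqa: F401
from pyspark.ml import Pipeline
from pyspark.ml.feature import StringIndexer, VectorAssembler
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
train_df = spark.createDataFrame(
    [("a", 1.0), ("b", 2.0)], ["category", "amount"])           # tiny placeholder dataset

# Fit a small feature-transformation pipeline (stand-in for the author's real one).
pipeline = Pipeline(stages=[
    StringIndexer(inputCol="category", outputCol="category_idx"),
    VectorAssembler(inputCols=["category_idx", "amount"], outputCol="features"),
])
pipeline_model = pipeline.fit(train_df)

# Serialize to an MLeap bundle, the artifact format this container expects.
pipeline_model.serializeToBundle(
    "jar:file:/tmp/feature-pipeline.zip",
    pipeline_model.transform(train_df),    # sample DataFrame used to capture the schema
)
# The .zip is then repackaged as model.tar.gz and uploaded to S3 for SageMaker.
```
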

After deploying an MLeap model artifact on a SageMaker endpoint, how can I get probabilities for the prediction rather than a binary outcome at inference time? I tried changing the output column from prediction ...
  • janvekarnaveed
  • 2
  • Opened on Jun 5, 2020
  • #13

Hello, I would like to understand why this limitation is in place. Presumably most machine learning models take in much more than 16 features. I created a model with over 100 features. I tried to pass ...
  • rchazelle
  • 4
  • Opened on May 16, 2020
  • #12

Hello, is there a plan to upgrade the SparkML serving containers to support pipeline models generated using Spark 2.4.3 and serialized using MLeap 0.14.0? I would greatly appreciate it if someone could please share ...
  • sajjap
  • 1
  • Opened on Nov 13, 2019
  • #10

I am trying to create a Pipeline model that combines a SparkML model with the BlazingText model for a text classification task. The SparkML model is used to pre-process the input texts. I configured SparkML ...
  • vincentnqt
  • 5
  • Opened on Aug 16, 2019
  • #6
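
For #6 above, a hedged sketch of chaining a SparkML pre-processing model and a BlazingText model behind one endpoint with the SageMaker Python SDK's PipelineModel. Artifact paths, the role, the image URI, and the schema are placeholders, not the issue author's configuration.

```python
import json
from sagemaker.model import Model
from sagemaker.pipeline import PipelineModel
from sagemaker.sparkml.model import SparkMLModel

role = "arn:aws:iam::111122223333:role/SageMakerRole"    # placeholder execution role

schema = {
    "input": [{"name": "text", "type": "string"}],
    # Output shape here is a placeholder guess; the real schema depends on what the
    # downstream BlazingText container expects as its input.
    "output": {"name": "tokenized_text", "type": "string", "struct": "array"},
}
sparkml_model = SparkMLModel(
    model_data="s3://my-bucket/sparkml/model.tar.gz",
    role=role,
    env={"SAGEMAKER_SPARKML_SCHEMA": json.dumps(schema)},
)

# BlazingText model; the image URI would normally be resolved via sagemaker.image_uris.retrieve.
bt_model = Model(
    image_uri="111122223333.dkr.ecr.us-east-1.amazonaws.com/blazingtext:1",  # placeholder
    model_data="s3://my-bucket/blazingtext/model.tar.gz",
    role=role,
)

# Both models are hosted behind a single endpoint; each request flows through them in order.
pipeline_model = PipelineModel(name="sparkml-blazingtext", role=role,
                               models=[sparkml_model, bt_model])
predictor = pipeline_model.deploy(initial_instance_count=1, instance_type="ml.m5.large")
```
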

I am trying to deploy a bundled Spark ML NaiveBayesModel with sagemaker-sparkml-serving-container. I am running sagemaker-sparkml-serving-container with the following command: SCHEMA= { input :[{ name : ...
  • make
  • 7
  • Opened on Nov 30, 2018
  • #3
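
For #3 above, a hedged sketch of calling a locally running copy of the container once it has been started with the model mounted and the schema environment variable set. It assumes the standard SageMaker serving port 8080 and /invocations route plus a simple two-column schema; the payload field names are assumptions to check against the repo README, not the issue author's exact setup.

```python
import json
import requests

# One record whose values follow the order of the columns declared in the schema.
payload = {"data": ["some text to classify", 42.0]}

resp = requests.post(
    "http://localhost:8080/invocations",      # standard SageMaker serving route
    data=json.dumps(payload),
    headers={"Content-Type": "application/json", "Accept": "text/csv"},
)
print(resp.status_code, resp.text)            # e.g. the NaiveBayes prediction
```
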