•   almost 8 years ago

SPen SDK 2.0 source access

Is there any chance we can access the source code for the SDK? The documentation is a bit thin, and given the current state of the SDK, as others have indicated, a lot of workarounds are needed to do a reasonable job for the challenge. The SDK source would definitely help when we're out of options.

Best.

  • 15 comments

  •   •   almost 8 years ago

    I agree. It's not as if the SDK contains a lot of secret or highly-advanced stuff.

  •   •   almost 8 years ago

    I agree. It would be very useful for overcoming the current limitations of the SDK, especially for understanding how to handle pen events from both the SDK and Android.

    Thanks in advance

  •   •   almost 8 years ago

    Hi,

    I can sympathize with wanting to look under the hood, but we can't release the source. I'm afraid that isn't our decision to make. Realistically, it would take too long to arrange to be practical anyway. Sorry.

    Please do ask questions here and let us know what problems you're seeing and what features you'd like. Your input will help drive the direction of the SDK.

    Best,
    Hod Greeley
    Samsung Developers

  •   •   almost 8 years ago

    @Hod: Okay,

    1) Is there a way to fire an event when an object *IS* being selected?
    2) When a CanvasView is saved and loaded back, the original background is missing.
    3) Is there any way to search a saved CanvasView by keywords in its title, tags, or description?

  •   •   almost 8 years ago

    There is a lot missing from the SDK, and the documentation is very minimal. I think any substantial application would require writing a new library. Missing items include:
    1. The ability to modify strokes after they have been drawn by the user.
    2. The ability to restore and reorder all object info.
    3. The ability to track and interact with object selection.
    4. The ability to select an object programmatically by its ID (the IDs all seem to be -1).
    5. Clear documentation on how pen events flow, including the button, so they can be used outside of CanvasView (a rough sketch follows this list). The events are there, but few would want to invest in a lot of programming without some guarantee that the API would be stable.
    6. The ability to easily disable the button's double-tap and hold functionality within an app, so that the app can use the button.
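
    On point 5, the raw events do already reach a plain View through the standard Android input path. Here is a rough, untested sketch of the sort of thing I mean. It assumes an ICS-level (API 14) framework, and the class name and the mapping of the pen's side button to BUTTON_SECONDARY are my own assumptions, not anything documented in the SDK.

    import android.content.Context;
    import android.view.MotionEvent;
    import android.view.View;

    // Illustrative only: watch pen hover and side-button state outside of
    // CanvasView using stock framework APIs (no S-Pen SDK involved).
    public class PenAwareView extends View {

        public PenAwareView(Context context) {
            super(context);
        }

        @Override
        public boolean onGenericMotionEvent(MotionEvent event) {
            // Hover events arrive here while the pen floats above the screen.
            if (event.getToolType(0) == MotionEvent.TOOL_TYPE_STYLUS
                    && event.getActionMasked() == MotionEvent.ACTION_HOVER_MOVE) {
                // Assumption: the pen's side button is reported as the secondary button.
                boolean buttonDown =
                        (event.getButtonState() & MotionEvent.BUTTON_SECONDARY) != 0;
                onPenHover(event.getX(), event.getY(), buttonDown);
                return true;
            }
            return super.onGenericMotionEvent(event);
        }

        private void onPenHover(float x, float y, boolean buttonDown) {
            // Application-specific handling would go here.
        }
    }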

    The S-Pen SDK seems to have been written around a single application model. You ask people to create new and exciting apps, but you have a limited model and refuse to provide any access to the information necessary to work "outside the box". If you really want to inspire some creative apps, don't make developers reverse engineer all your stuff. Samsung makes its money on the hardware - you need the widest possible community of developers to support that. It is in your best interest to be as open as possible.

  •   •   almost 8 years ago

    Just a follow-on from my last post.

    I think the Galaxy Note is a fantastic phone. I also have a Galaxy Nexus and an Atrix 2, and I find the Note to be by far the best. When it gets ICS it will be outstanding.

    The S-Pen is also nice, not only for drawing, but for many fine manipulation tasks. With it the Note really is the complete personal communications device: information rich, crystal clear, and fully functional, both for quick tasks and detailed work, and still fits in a pocket.

    It is that potential which really makes me want to see some "out of the box" thinking for S-Pen apps. Let the developers run with this ball!

  •   •   almost 8 years ago

    Hi William,
    Thank you for your feedback. It will be passed along to the technical teams.

  •   •   almost 8 years ago

    I just want to echo what others are saying. If you really want developers to create compelling apps, please consider open-sourcing the SDK.

    I think Samsung should take a cue from Google, which open-sources a lot of its projects. And we are talking about code that is infinitely more complex than the SDK - such as Chrome, V8, Dart, and of course Android, among others.

    Don't be afraid of releasing the source. You will have my respect if you do that :)

  •   •   almost 8 years ago

    Thanks, Numan, for your feedback and for following this App Challenge so closely!

  •   •   almost 8 years ago

    I hope that you would forward this feedback to your product planning/marketing people as well. This issue, I believe, is a business strategy issue rather than a technical one.

  •   •   almost 8 years ago

    William, I started exploring the SDK yesterday and have the same questions that you raised about selection and manipulating object infos.

    Have you been able to get any definitive answers on any of this? I looked around in the dev forums but wasn't able to find anything....

  •   •   almost 8 years ago

    I haven't gotten any answers.

    After spending some time with it, I've concluded that unless you are trying to recreate the S Memo application, the S-Pen SDK doesn't offer much added value. The basic pen functions can be accessed from MotionEvents and KeyEvents. Given the SDK's incomplete functionality, closed source, and obfuscated base classes, I'm more comfortable building on the core platform functionality than on the SDK. Then again, I am not trying to recreate S Memo.
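
    To be concrete, here is the kind of thing I mean - an untested sketch with a made-up view class, using only the ICS-era platform MotionEvent constants and no SDK classes at all:

    import android.content.Context;
    import android.graphics.Canvas;
    import android.graphics.Paint;
    import android.graphics.Path;
    import android.view.MotionEvent;
    import android.view.View;

    // Illustrative only: basic pressure-aware inking built on plain platform APIs.
    public class SimpleInkView extends View {
        private final Path path = new Path();
        private final Paint paint = new Paint();

        public SimpleInkView(Context context) {
            super(context);
            paint.setStyle(Paint.Style.STROKE);
        }

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            // React only to the stylus, not to finger touches.
            if (event.getToolType(0) != MotionEvent.TOOL_TYPE_STYLUS) {
                return false;
            }
            switch (event.getActionMasked()) {
                case MotionEvent.ACTION_DOWN:
                    // Pen tip pressure (roughly 0..1) sets the stroke width;
                    // in this simplified sketch the whole path shares one width.
                    paint.setStrokeWidth(2f + 8f * event.getPressure());
                    path.moveTo(event.getX(), event.getY());
                    break;
                case MotionEvent.ACTION_MOVE:
                    path.lineTo(event.getX(), event.getY());
                    break;
            }
            invalidate();
            return true;
        }

        @Override
        protected void onDraw(Canvas canvas) {
            canvas.drawPath(path, paint);
        }
    }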

  •   •   almost 8 years ago

    Hi,

    I wanted to answer the questions about the object manipulation. Those APIs are not 100% implemented. I apologize for the confusion. The APIs should have been hidden until they were complete. This feedback has been sent to the development group (and they agreed it was a mistake).

    You can subclass the Canvas classes and get the pen events, and you can use the built-in text recognition to easily let a user both draw and add text to an image. So you can use the Canvas abilities and add further sophistication by looking at the pen events directly.
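
    For example, something along these lines should work. It is only a rough, untested sketch - the CanvasView package and constructor shown here are not guaranteed to match, so please double-check them against the sample projects.

    import android.content.Context;
    import android.util.AttributeSet;
    import android.view.MotionEvent;

    import com.samsung.sdraw.CanvasView; // package assumed; see the SDK samples

    // Illustrative only: peek at the pen events, then hand them back to
    // CanvasView so its normal ink handling still runs.
    public class ObservableCanvasView extends CanvasView {

        public ObservableCanvasView(Context context, AttributeSet attrs) {
            super(context, attrs);
        }

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            // Inspect the raw event before the SDK consumes it,
            // e.g. log pressure or feed it to a custom gesture detector.
            handlePenEvent(event);
            // Let CanvasView do its normal drawing.
            return super.onTouchEvent(event);
        }

        private void handlePenEvent(MotionEvent event) {
            // Application-specific handling would go here.
        }
    }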

    The example apps in the SDK give a good idea of the capabilities currently implemented. You can tell from the exposed APIs that we are already heading in the direction of many of the suggestions.

    Again, I apologize for the confusion. As you can see, the SDK already supplies some sophisticated capabilities, and it will support even more in a flexible manner in future releases. Unfortunately, these things take time. Rushing them would almost certainly create worse problems later.

    Best,
    Hod Greeley
    Samsung Developers

  •   •   almost 8 years ago

    Hod,

    Thanks for your update. I appreciate how difficult it can be to develop a comprehensive application framework.

    One question: you mention that "You can subclass the Canvas classes and get the pen events, along with using the text recognition". I haven't been able to find the APIs for text recognition. Can you point me in their direction?

    Thanks,
    William

  •   •   almost 8 years ago

    Hi William,

    The text recognition is built in. When a keyboard is brought up to take input, you can switch it over to accept handwriting and convert it. You can see this in one of the example projects.

    There are APIs that were accidentally exposed before being completed. One of these will allow reading out the resulting text. Unfortunately, right now you can use the recognition to get text onto a canvas, but you can't read out the actual characters.

    Best,
    Hod Greeley
    Samsung Developers

Comments are closed.