WO2015049697A1 - A gesture based system for translation and transliteration of input text and a method thereof - Google Patents

A gesture based system for translation and transliteration of input text and a method thereof

Info

Publication number
WO2015049697A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
input text
text
user interface
electrical signals
Prior art date
Application number
PCT/IN2014/000623
Other languages
French (fr)
Inventor
Deshmukh Rakesh
Bangarambandi Sudhir
Dongre Akash
Padmanabhan Hariharan
Original Assignee
Deshmukh Rakesh
Bangarambandi Sudhir
Dongre Akash
Padmanabhan Hariharan
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Deshmukh Rakesh, Bangarambandi Sudhir, Dongre Akash, Padmanabhan Hariharan
Priority to KR1020187022878A (KR101995741B1)
Priority to SG11201602622QA
Priority to EP14851325.2A (EP3053061A4)
Priority to KR1020167011036A (KR20160071400A)
Priority to RU2016115384A (RU2708357C2)
Publication of WO2015049697A1
Priority to IL244824A (IL244824B)
Priority to PH12016500592A (PH12016500592A1)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/018 Input/output arrangements for oriental characters
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F 3/023 Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F 3/0233 Character input methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00 Handling natural language data
    • G06F 40/40 Processing or translation of natural language
    • G06F 40/58 Use of machine translation, e.g. for multi-lingual retrieval, for server-side translation for client devices or for real-time translation
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L 25/00 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
    • G10L 25/03 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 characterised by the type of extracted parameters
    • G10L 25/15 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 characterised by the type of extracted parameters the extracted parameters being formant information
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L 25/00 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
    • G10L 25/48 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
    • G10L 25/51 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Machine Translation (AREA)
  • Financial Or Insurance-Related Operations Such As Payment And Settlement (AREA)

Abstract

A gesture based system for translation and transliteration of input text, and a corresponding method, are disclosed. The system and method envisaged by the present disclosure provide for selective translation and transliteration of the input text, based on the gestures performed by a user. In accordance with the present disclosure, when the user performs a left swipe gesture, input text which is in a first language is transliterated into a language prescribed by the user. In the event that the user performs a right swipe, the input text is translated from the first language into a language prescribed by the user. The system and method envisaged by the present disclosure also enable the user to switch back to the original language of the input text, i.e., the first language, by performing predetermined gestures.

Description

A GESTURE BASED SYSTEM FOR TRANSLATION AND TRANSLITERATION OF INPUT TEXT AND A METHOD THEREOF
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This patent application claims the priority of the Indian Provisional Application entitled "METHOD AND SYSTEM FOR REAL TIME LANGUAGE TRANSLATION AND TRANSLITERATION", with serial number 3161/MUM/2013, filed at the Government of India Patent Office on 4th October, 2013, the contents of which are incorporated by reference herein.
BACKGROUND
Technical Field
[0002] The present disclosure relates to the field of user interfaces. Particularly, the present disclosure relates to Indian language interfaces. More particularly, the present disclosure relates to user interfaces with translation and transliteration capabilities.
Description of the Related Art
[0003] In view of the recent technological advancements in the domain of communication, there has been a substantial increase in the number of people using mobile phones for the purpose of communicating with one another. Mobile phones available in the market today enable users to verbally communicate with one another, and also enable users to exchange text messages.
[0004] Further, some of the mobile phones also provide users with machine translation facilities. Typically, machine translation facilities are provided on mobile phones as separate, installable software applications. Further, a user also has the option of accessing the World Wide Web on his mobile phone to avail the machine translation facilities provided on the internet.

[0005] However, one of the drawbacks associated with prior-art machine translation facilities and applications thereof is that these applications are not integrated with the operating system of the mobile phone; in order to make use of these facilities to get the input text translated, the user has to manually copy the input text and feed the same to the translation application. The aforementioned process of availing machine translation on prior-art translation facilities (applications) makes the translation process cumbersome and difficult to implement.
[0006] To alleviate the aforementioned problems and to simplify the entire process of translation/transliteration of input text, US Patent Application Publication US20070255554 proposed 'a system and method for providing automatic language translation in a telecommunications network'. However, the system envisaged by US20070255554 involved routing a communication call via a service center for automatic translation. The use of a third-party service center for automatic translation means that, firstly, the secrecy/confidentiality of the input text is compromised as it is re-routed to a third-party service center, and secondly, the user is not able to compare the original text and the translated text, since the actual translation takes place while the message is in transit. Further, the prior art systems do not provide the facility of transliterating from an Indic language to the English language.
OBJECTS OF THE EMBODIMENTS
[0007] An object of the present disclosure is to provide a system that enables efficient and effective translation of input text.
[0008] Yet another object of the present disclosure is to provide a system that enables effective and efficient transliteration of the input text.
[0009] Still a further object of the present disclosure is to provide a system that enables translation and transliteration of the input text based on predetermined gestures performed by the user.

[0010] One more objective of the present disclosure is to provide a system that is user friendly and easy to use.

[0011] Yet another object of the present disclosure is to make available a system that provides for translation and transliteration of input text from a plurality of Indic languages to English and vice-versa.

[0012] Still a further object of the present disclosure is to provide a system that enables a user to provide the input text in a native (Indic) language, and subsequently convert the input text into English before transmission of the input text as SMS/E-mail.
[0013] Yet another objective of the present disclosure is to provide a system that enables a user to receive a text string in English, and subsequently convert the received text string into any of the predetermined Indic languages.
SUMMARY
[0014] The present disclosure envisages a computer implemented system for selectively expressing an input text in a language other than the input language. The system envisaged by the present disclosure translates/transliterates the input text based on the gestures performed by a user. The system comprises a user interface configured to receive the text, as input from the user.

[0015] The system further includes a recognition module cooperating with the user interface. The recognition module is configured to recognize the gestures performed by the user on the user interface, and convert predetermined gestures of the user into corresponding electrical signals. The electrical signals include instructions for expressing the input text in a language other than the input language.

[0016] The system further includes a translation module cooperating with the recognition module to selectively receive the electrical signals, the translation module further configured to translate the input text subsequent to receiving the electrical signals from the recognition module.

[0017] The system further includes a transliteration module cooperating with the recognition module to selectively receive the electrical signals, the transliteration module further configured to transliterate the input text subsequent to receiving the electrical signals from the recognition module.

[0018] In accordance with the present disclosure, the user interface further cooperates with the translation module and the transliteration module to receive translated text and transliterated text, the user interface still further configured to selectively display at least the input text, the translated text and the transliterated text, based on the predetermined gestures performed by the user.

[0019] In accordance with the present disclosure, the recognition module is further configured to recognize at least a swipe left gesture and a swipe right gesture performed by the user, on the user interface.

[0020] In accordance with the present disclosure, the recognition module is further configured to convert the swipe left gesture into electric signals for transliterating the input text provided by the user, the recognition module further configured to convert the swipe right gesture into electric signals for translating the input text provided by the user.
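The summary above describes the modules functionally without fixing any concrete interface. Purely as an illustration, the following Kotlin sketch models one possible wiring of the described signal flow; every name in it (Action, GestureSignal, RecognitionModule and so on) is invented for this example and is not taken from the specification.

```kotlin
// Hypothetical sketch of the module layout summarized in paragraphs [0014]-[0020].
// All names are illustrative; the disclosure does not prescribe concrete APIs.

enum class Action { TRANSLATE, TRANSLITERATE }

// Models the "electrical signals" carrying instructions for expressing the
// input text in another language: a requested action plus a target language.
data class GestureSignal(val action: Action, val targetLanguage: String)

interface TranslationModule {
    fun translate(text: String, targetLanguage: String): String
}

interface TransliterationModule {
    fun transliterate(text: String, targetLanguage: String): String
}

// The recognition module routes a signal to the matching module and hands the
// result back for the user interface to display selectively.
class RecognitionModule(
    private val translator: TranslationModule,
    private val transliterator: TransliterationModule
) {
    fun onGesture(signal: GestureSignal, inputText: String): String =
        when (signal.action) {
            Action.TRANSLATE -> translator.translate(inputText, signal.targetLanguage)
            Action.TRANSLITERATE -> transliterator.transliterate(inputText, signal.targetLanguage)
        }
}
```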
[0021] The present disclosure further envisages a computer implemented method for selectively expressing an input text in a language other than the input language, based on the gestures performed by a user. In accordance with the present disclosure, the computer implemented method comprises the following steps: receiving input text from the user, via a user interface; recognizing, using a recognition module, predetermined gestures performed by the user on the user interface; converting the predetermined gestures from the user into corresponding electrical signals, wherein the electrical signals comprise instructions for expressing the input text in a language other than the input language; selectively transmitting the electrical signals to a translation module and a transliteration module; translating the input text, subsequent to receiving the electrical signals from the recognition module; transliterating the input text, subsequent to receiving the electrical signals from the recognition module; receiving translated text and transliterated text using the user interface; and selectively displaying the input text, the translated text and the transliterated text, on the user interface, based on the predetermined gestures performed by the user.
[0022] In accordance with the present disclosure, the step of recognizing predetermined gestures performed by the user on the user interface, further includes the step of recognizing at least a swipe left gesture and a swipe right gesture performed by the user, on the user interface.
[0023] In accordance with the present disclosure, the step of recognizing at least a swipe left gesture and a swipe right gesture, further includes the step of converting the swipe left gesture into electric signals for transliterating the input text provided by the user, and converting the swipe right gesture into electric signals for translating the input text provided by the user.
[0024] A non-transitory computer readable medium having computer readable instructions stored thereupon, the computer readable instructions, when executed by a processor, cause a computer enabled device to: receive input text from the user, via a user interface; recognize, using a recognition module, predetermined gestures performed by the user on the user interface; convert the predetermined gestures from the user into corresponding electrical signals, wherein the electrical signals comprise instructions for expressing the input text in a language other than the input language; selectively transmit the electrical signals to a translation module and a transliteration module; translate the input text, subsequent to receiving the electrical signals from the recognition module; transliterate the input text, subsequent to receiving the electrical signals from the recognition module; receive translated text and transliterated text using the user interface; and selectively display the input text, the translated text and the transliterated text, on the user interface, based on the predetermined gestures performed by the user.
[0025] In accordance with the present disclosure, the computer readable instructions are further configured to enable the computer enabled device to recognize at least a swipe left gesture and a swipe right gesture performed by the user, on the user interface.
[0026] In accordance with the present disclosure, the computer readable instructions are further configured to enable the computer enabled device to convert the swipe left gesture into electric signals for transliterating the input text provided by the user, and convert the swipe right gesture into electric signals for translating the input text provided by the user.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
[0027] The present disclosure will be better understood from the following detailed description, with reference to the accompanying drawings, in which:
[0028] FIG.1 illustrates a computer implemented system-level block diagram corresponding to a system for translation and transliteration of input text;
[0029] FIG.2 is a flow chart illustrating the steps involved in a computer- implemented method for translation and transliteration of input text;
[0030] FIGS. 3A and 3B depict an example user interface for transliterating input text; and

[0031] FIGS. 4A and 4B depict an example user interface for translating input text.
DETAILED DESCRIPTION OF THE EMBODIMENTS HEREIN
[0032] The present disclosure envisages a computer implemented, gesture based system for selective translation and transliteration of input text. The translation and transliteration are selective in the sense that the translated/transliterated input text is displayed to a user only in response to predetermined gestures performed by the user. The system envisaged by the present disclosure enables efficient and effective translation and transliteration of input text from English to a plurality of Indic languages and vice-versa. The system envisaged by the present disclosure enables a user to provide the input text in a native (Indic) language, and subsequently convert the input text into English before transmission of the input text as SMS/E-mail. Further, the system also enables the user to receive a text string in English, and subsequently convert the received text string into any of the predetermined Indic languages.

[0033] Referring to the accompanying drawings, FIG.1 illustrates a computer implemented, gesture based system 100 for translation and transliteration of input text. The system 100, in accordance with the present disclosure, includes a user interface 10. The user interface 10 is configured to, inter-alia, receive text as input from the user. The text input by the user can either be in English or in any of the predetermined Indic languages including but not restricted to Bengali, Gujarati, Assamese, Oriya, Kannada, Tamil, Telugu and Malayalam. Further, the user interface 10 is also configured to display text received in the form of emails/SMS/web pages and the like.

[0034] The user interface 10, in accordance with the present disclosure, is further configured to enable a user to communicate with the user interface 10 via a predetermined set of gestures. In a preferred embodiment, the user interface 10, being hosted on a touch sensitive/responsive display screen, enables the user to interact by performing certain gestures including but not restricted to a left swipe gesture, right swipe gesture, tapping gesture, tap and hold gesture, circular gesture, semi-circular gesture, and double tap gesture. Further, the user interface 10 is configured to enable the user to input text by way of selectively tapping on the alphabet keys, alphanumeric keys, and the numeric keys displayed thereon.
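The disclosure names the gesture vocabulary but not how a swipe is detected. Assuming a simple horizontal-displacement test over a completed touch stroke, one illustrative way to classify the two swipes of interest might look as follows (the 50-pixel threshold is an arbitrary value chosen for the sketch):

```kotlin
// Illustrative swipe classifier; the specification does not mandate any
// particular detection technique or threshold.
enum class Swipe { LEFT, RIGHT, NONE }

fun classifySwipe(downX: Float, upX: Float, thresholdPx: Float = 50f): Swipe = when {
    upX - downX > thresholdPx -> Swipe.RIGHT // finger moved right
    downX - upX > thresholdPx -> Swipe.LEFT  // finger moved left
    else -> Swipe.NONE                       // too short to count as a swipe
}
```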
[0035] The system 100, in accordance with the present disclosure, further comprises a recognition module 12 cooperating with the user interface 10. The recognition module 12, in accordance with the present disclosure, is configured to recognize and interpret the gestures performed by the user on the user interface 10. The recognition module 12 is configured to recognize the point of contact of the user's finger on the user interface 10 (using well known techniques including the resistive technique, capacitive technique and surface acoustic wave technique) and accordingly interpret the 'touch' from the user as selection of corresponding characters/numerals/functions displayed in the form of selectable keys.

[0036] Further, the characters/numerals whose corresponding keys are touched/selected by the user are considered as input text by the recognition module 12. The term 'input text', in the context of the present disclosure, is considered as a string of characters. Further, the 'input text' can also be construed as comprising a combination of alphabets and numerals. Further, the 'input text' can also be construed as a combination of alphabets, numerals and special characters. However, in a preferred embodiment of the present disclosure, for the sake of explanation, the input text is considered as a string of alphabets.

[0037] In accordance with the present disclosure, the recognition module 12 is further configured to interpret predetermined gestures of the user as gestures indicative of requesting translation and transliteration of the input text.
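Paragraph [0035] leaves the touch-sensing details to 'well known techniques'. As a sketch only, assuming keys are axis-aligned rectangles, a reported contact point could be resolved to a key and accumulated into the input-text string of paragraph [0036] along these lines (geometry and names are the editor's assumptions):

```kotlin
// Hypothetical hit-test from a contact point to an on-screen key; the
// specification does not prescribe this representation.
data class Key(val label: Char, val x: Int, val y: Int, val width: Int, val height: Int)

class KeyboardHitTester(private val keys: List<Key>) {
    private val buffer = StringBuilder()

    // Map the reported point of contact to the key whose bounds contain it,
    // appending that key's character to the accumulated input text.
    fun onTouch(px: Int, py: Int): String {
        keys.firstOrNull { k ->
            px in k.x until (k.x + k.width) && py in k.y until (k.y + k.height)
        }?.let { buffer.append(it.label) }
        return buffer.toString()
    }
}
```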
[0038] In accordance with the preferred embodiment of the present disclosure, a left swipe gesture performed by the user on the user interface 10 is interpreted by the recognition module 12 as a request for transliteration of the input text. In accordance with the preferred embodiment of the present disclosure, a right swipe gesture performed by the user on the user interface 10 is interpreted by the recognition module 12 as a request for translation of the input text. The interpretations performed by the recognition module 12 in response to the swipe left gesture and the swipe right gesture are hardcoded onto the recognition module 12, in accordance with the preferred embodiment of the present disclosure. However, it is within the scope of the present disclosure to substitute the swipe left and swipe right gestures with other appropriate gestures, and to hardcode the substituted gestures as corresponding to the translation and transliteration requests.
[0039] In accordance with the preferred embodiment of the present disclosure, the recognition module 12 cooperates with the user interface to identify a left swipe gesture and a right swipe gesture respectively. After the input text is keyed-in, if the user performs a left swipe on the user interface 10, the recognition module 12 triggers a transliteration module 16 and subsequently instructs the transliteration module 16 to transliterate the input text into a language chosen by the user. In the event that the user performs a right swipe soon after entering the input text, the recognition module 12 triggers a translation module 14 and subsequently instructs the translation module 14 to translate the input text into a language chosen by the user. Even though the functionalities of the translation module 14 and the transliteration module 16 have been explained with reference to text input by the user, the translation module 14 and the transliteration module 16 are also configured to respectively translate and transliterate any text, for example, text received in the form of SMS/email/web-pages and the like, from the native language to the language prescribed by the user.

[0040] Referring to FIG.3A, there is shown an example user interface 10, using which the user has keyed-in the sentence 'how are you' in English. Subsequent to keying-in the input text, in the event that the user performs a swipe-left gesture, the transliteration module 16 is invoked and the English text 'how are you' is transliterated into any of the Indic languages selected by the user, as shown in FIG.3B.
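Reusing the hypothetical types from the earlier sketches, the dispatch of paragraphs [0038]-[0040] reduces to a two-way branch. The Hindi strings in the comments are the editor's own illustrative renderings of 'how are you', not text reproduced from the figures:

```kotlin
// Left swipe -> transliteration; right swipe -> translation, per the
// preferred embodiment. Types reuse the earlier illustrative sketches.
fun handleSwipe(
    swipe: Swipe,
    inputText: String,
    targetLanguage: String,
    recognition: RecognitionModule
): String? = when (swipe) {
    Swipe.LEFT -> recognition.onGesture(GestureSignal(Action.TRANSLITERATE, targetLanguage), inputText)
    Swipe.RIGHT -> recognition.onGesture(GestureSignal(Action.TRANSLATE, targetLanguage), inputText)
    Swipe.NONE -> null // no recognized gesture; keep showing the input text
}

// Illustrative usage (outputs are the editor's own examples):
// handleSwipe(Swipe.LEFT,  "how are you", "hi", r)  // -> "हाउ आर यू"  (transliterated)
// handleSwipe(Swipe.RIGHT, "how are you", "hi", r)  // -> "आप कैसे हैं" (translated)
```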
[0041] Referring to FIG.4A, there is shown another example user interface 10, using which the user has keyed-in the sentence 'how are you' in English. Subsequent to keying-in the input text, in the event that the user performs a swipe-right gesture, the translation module 14 is invoked and the English text 'how are you' is translated into any of the Indic languages selected by the user, as shown in FIG.4B.
[0042] In accordance with the present disclosure, the user interface 10 is configured to enable the user to first select a language for keying-in the input text. The language selected by the user could include but is not restricted to English, Hindi, Bengali, Gujarati, Assamese, Oriya, Urdu, Tamil, Telugu, Malayalam and Kannada. Further, the aforementioned non-exhaustive list of languages is also available for selection as the language to which the input text has to be translated and/or transliterated. Further, the recognition module 12 is configured to invoke the translation module 14 and the transliteration module 16 by way of transmitting electric signals having instructions embedded thereupon for activating the translation module 14 and the transliteration module 16.
[0043] Further, in accordance with the present disclosure, the default screen of the user interface 10 is the screen used for keying-in the input text (as shown in FIG.3A). After the input text is keyed-in, in the event that the user performs a left-swipe gesture, the user interface 10 changes to the view shown in FIG.3B; in the event that the user then performs a right swipe on the user interface 10 exemplified in FIG.3B, the user interface 10 is restored to the default screen shown in FIG.3A.
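Together with the symmetric right-swipe behaviour described in the next paragraph, these transitions amount to a three-state machine. A minimal sketch, with invented state names and reusing the illustrative Swipe enum from above:

```kotlin
// Screen transitions of paragraphs [0043]-[0044]; the state labels are the
// editor's, chosen to mirror the figures.
enum class Screen { DEFAULT, TRANSLITERATED, TRANSLATED }

fun nextScreen(current: Screen, swipe: Swipe): Screen = when {
    current == Screen.DEFAULT && swipe == Swipe.LEFT -> Screen.TRANSLITERATED  // FIG.3A -> FIG.3B
    current == Screen.DEFAULT && swipe == Swipe.RIGHT -> Screen.TRANSLATED     // FIG.4A -> FIG.4B
    current == Screen.TRANSLITERATED && swipe == Swipe.RIGHT -> Screen.DEFAULT // FIG.3B -> FIG.3A
    current == Screen.TRANSLATED && swipe == Swipe.LEFT -> Screen.DEFAULT      // FIG.4B -> FIG.4A
    else -> current // other gestures leave the view unchanged
}
```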
[0044] Further, in accordance with the present disclosure, the default screen of the user interface 10 is the screen used for keying-in the input text (as shown in FIG.4A). After the input text is keyed-in, in the event that the user performs a right-swipe gesture, the user interface 10 changes to the view shown in FIG.4B; in the event that the user then performs a left swipe on the user interface 10 exemplified in FIG.4B, the user interface 10 is restored to the default screen shown in FIG.4A.

[0045] Referring to FIG.2, there is shown a flow chart illustrating the steps involved in the method for translation and transliteration of input text. The method envisaged by the present disclosure is responsive to the gestures performed by the user. The method, in accordance with the present disclosure, comprises the following steps:
receiving input text from the user, via a user interface (200);
recognizing, using a recognition module, predetermined gestures performed by the user on the user interface (202);
converting said predetermined gestures from the user into corresponding electrical signals, wherein said electrical signals comprise instructions for expressing the input text in a language other than the input language (204);
selectively transmitting said electrical signals to a translation module and a transliteration module (206);
translating the input text, subsequent to receiving said electrical signals from the recognition module (208);
transliterating the input text, subsequent to receiving said electrical signals from the recognition module (210); and
receiving translated text and transliterated text using the user interface; and selectively displaying the input text, the translated text and the transliterated text, on the user interface, based on the predetermined gestures performed by the user (212).

[0046] In accordance with the present disclosure, the step of recognizing predetermined gestures performed by the user on the user interface further includes the step of recognizing at least a swipe left gesture and a swipe right gesture performed by the user, on the user interface.
[0047] In accordance with the present disclosure, the step of recognizing at least a swipe left gesture and a swipe right gesture, further includes the step of converting the swipe left gesture into electric signals for transliterating the input text provided by the user, and converting the swipe right gesture into electric signals for translating the input text provided by the user.
[0048] Another embodiment of the present disclosure discloses a non-transitory computer readable medium having computer readable instructions stored thereupon, the computer readable instructions, when executed by a processor, cause a computer enabled device to: receive input text from the user, via a user interface; recognize, using a recognition module, predetermined gestures performed by the user on the user interface; convert said predetermined gestures from the user into corresponding electrical signals, wherein said electrical signals comprise instructions for expressing the input text in a language other than the input language; selectively transmit said electrical signals to a translation module and a transliteration module; translate the input text, subsequent to receiving said electrical signals from the recognition module; transliterate the input text, subsequent to receiving said electrical signals from the recognition module; receive translated text and transliterated text using the user interface; and selectively display the input text, the translated text and the transliterated text, on the user interface, based on the predetermined gestures performed by the user.
[0049] In accordance with the present disclosure, the computer readable instructions are further configured to enable the computer enabled device to recognize at least a swipe left gesture and a swipe right gesture performed by the user, on the user interface.
[0050] In accordance with the present disclosure, the computer readable instructions are further configured to enable the computer enabled device to convert the swipe left gesture into electric signals for transliterating the input text provided by the user, and convert the swipe right gesture into electric signals for translating the input text provided by the user.
TECHNICAL ADVANTAGES
[0051] The technical advantages envisaged by the present disclosure include the realization of a system that enables efficient and effective translation of input text. The system envisaged by the present disclosure also provides for effective and efficient transliteration of the input text. Further, the system envisaged by the present disclosure enables translation and transliteration of the input text based on predetermined gestures performed by the user. The system envisaged by the present disclosure is easy-to-use and user friendly. The system provides for translation and transliteration of input text from a plurality of Indic languages to English and vice-versa. The system envisaged by the present disclosure enables a user to provide the input text in a native (Indic) language, and subsequently convert the input text into English before transmission of the input text as SMS/E-mail. Further, the system also enables the user to receive a text string in English, and subsequently convert the received text string into any of the predetermined Indic languages.

Claims

CLAIMS

We Claim:
1. A computer implemented system for selectively expressing an input text in a language other than the input language, based on the gestures performed by a user, said computer implemented system comprising: a user interface accessible to the user, said user interface configured to receive the text, as input from the user; a recognition module cooperating with the user interface, said recognition module configured to recognize the gestures performed by the user on the user interface, said recognition module further configured to convert predetermined gestures of the user into corresponding electrical signals, said electrical signals comprising instructions for expressing the input text in a language other than the input language; a translation module cooperating with the recognition module to selectively receive said electrical signals, said translation module further configured to translate the input text subsequent to receiving said electrical signals from the recognition module; and a transliteration module cooperating with the recognition module to selectively receive said electrical signals, said transliteration module further configured to transliterate the input text subsequent to receiving said electrical signals from the recognition module; said user interface further cooperating with said translation module and transliteration module to receive translated text and transliterated text, said user interface still further configured to selectively display the input text, the translated text and the transliterated text, based on the predetermined gestures performed by the user.
2. The system as claimed in claim 1, wherein said recognition module is further configured to recognize at least a swipe left gesture and a swipe right gesture performed by the user, on the user interface.
3. The system as claimed in claim 2, wherein said recognition module is further configured to convert the swipe left gesture into electric signals for transliterating the input text provided by the user, said recognition module further configured to convert the swipe right gesture into electric signals for translating the input text provided by the user.
4. A computer implemented method for selectively expressing an input text in a language other than the input language, based on the gestures performed by a user, said computer implemented method comprising the following steps: receiving input text from the user, via a user interface; recognizing, using a recognition module, predetermined gestures performed by the user on the user interface; converting said predetermined gestures from the user into corresponding electrical signals, wherein said electrical signals comprise instructions for expressing the input text in a language other than the input language; selectively transmitting said electrical signals to a translation module and a transliteration module; translating the input text, subsequent to receiving said electrical signals from the recognition module; transliterating the input text, subsequent to receiving said electrical signals from the recognition module; receiving translated text and transliterated text using the user interface; and selectively displaying the input text, the translated text and the transliterated text, on the user interface, based on the predetermined gestures performed by the user.
5. The method as claimed in claim 4, wherein the step of recognizing predetermined gestures performed by the user on the user interface, further includes the step of recognizing at least a swipe left gesture and a swipe right gesture performed by the user, on the user interface.
6. The method as claimed in claim 5, wherein the step of recognizing at least a swipe left gesture and a swipe right gesture, further includes the step of converting the swipe left gesture into electric signals for transliterating the input text provided by the user, and converting the swipe right gesture into electric signals for translating the input text provided by the user.
7. A non-transitory computer readable medium having computer readable instructions stored thereupon, said computer readable instructions, when executed by a processor, cause a computer enabled device to: receive input text from the user, via a user interface; recognize, using a recognition module, predetermined gestures performed by the user on the user interface; convert said predetermined gestures from the user into corresponding electrical signals, wherein said electrical signals comprise instructions for expressing the input text in a language other than the input language; selectively transmit said electrical signals to a translation module and a transliteration module; translate the input text, subsequent to receiving said electrical signals from the recognition module; transliterate the input text, subsequent to receiving said electrical signals from the recognition module; receive translated text and transliterated text using the user interface; and selectively display the input text, the translated text and the transliterated text, on the user interface, based on the predetermined gestures performed by the user.
8. The non-transitory computer readable medium as claimed in claim 7, wherein said computer readable instructions are further configured to enable the computer enabled device to recognize at least a swipe left gesture and a swipe right gesture performed by the user, on the user interface.
9. The non-transitory computer readable medium as claimed in claim 8, wherein said computer readable instructions are further configured to enable the computer enabled device to convert the swipe left gesture into electric signals for transliterating the input text provided by the user, and convert the swipe right gesture into electric signals for translating the input text provided by the user.
Date: 26-Sep-2014
Place: Bangalore
Rakesh Prabhu
Counsel for the Applicant
PCT/IN2014/000623 2013-10-04 2014-09-29 A gesture based system for translation and transliteration of input text and a method thereof WO2015049697A1 (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
KR1020187022878A KR101995741B1 (en) 2013-10-04 2014-09-29 A gesture based system for translation and transliteration of input text and a method thereof
SG11201602622QA SG11201602622QA (en) 2013-10-04 2014-09-29 A gesture based system for translation and transliteration of input text and a method thereof
EP14851325.2A EP3053061A4 (en) 2013-10-04 2014-09-29 A gesture based system for translation and transliteration of input text and a method thereof
KR1020167011036A KR20160071400A (en) 2013-10-04 2014-09-29 A gesture based system for translation and transliteration of input text and a method thereof
RU2016115384A RU2708357C2 (en) 2013-10-04 2014-09-29 Gestures-controlled system for translation and transliteration of input text and method of input text translation and transliteration
IL244824A IL244824B (en) 2013-10-04 2016-03-30 A gesture based system for translation and transliteration of input text and a method thereof
PH12016500592A PH12016500592A1 (en) 2013-10-04 2016-04-01 A gesture based system for translation and transliteration of input text and a method thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN3161/MUM/2013 2013-10-04
IN3161MU2013 IN2013MU03161A (en) 2013-10-04 2014-09-29

Publications (1)

Publication Number Publication Date
WO2015049697A1

Family

ID=52778327

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IN2014/000623 WO2015049697A1 (en) 2013-10-04 2014-09-29 A gesture based system for translation and transliteration of input text and a method thereof

Country Status (8)

Country Link
EP (1) EP3053061A4 (en)
KR (1) KR101995741B1 (en)
IL (1) IL244824B (en)
IN (1) IN2013MU03161A (en)
PH (1) PH12016500592A1 (en)
RU (1) RU2708357C2 (en)
SG (1) SG11201602622QA (en)
WO (1) WO2015049697A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200113522A (en) 2019-03-25 2020-10-07 삼성전자주식회사 Method for performing fucntion according to gesture input and electronic device performing thereof

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2146066C1 (en) * 1998-11-30 2000-02-27 Рудковский Александр Валентинович Foreign language learning tutorial
US7822596B2 (en) * 2005-12-05 2010-10-26 Microsoft Corporation Flexible display translation
CN102246126B (en) * 2008-12-15 2015-11-25 惠普开发有限公司 Based on the edit pattern of gesture
US9104312B2 (en) * 2010-03-12 2015-08-11 Nuance Communications, Inc. Multimodal text input system, such as for use with touch screens on mobile phones

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5432948A (en) * 1993-04-26 1995-07-11 Taligent, Inc. Object-oriented rule-based text input transliteration system
US20050043941A1 (en) 2003-08-21 2005-02-24 International Business Machines Corporation Method, apparatus, and program for transliteration of documents in various indian languages
US20070255554A1 (en) 2006-04-26 2007-11-01 Lucent Technologies Inc. Language translation service for text message communications
US20080097745A1 (en) 2006-10-18 2008-04-24 Domenica Bagnato Text analysis, transliteration and translation method and apparatus for hieroglypic, hieratic, and demotic texts from ancient egyptian
US20130191108A1 (en) 2008-08-06 2013-07-25 Abbyy Software Ltd. Translation of a Selected Text Fragment of a Screen
US20130151234A1 (en) 2011-12-12 2013-06-13 Google Inc. Techniques for input of a multi-character compound consonant or vowel and transliteration to another language using a touch computing device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3053061A4

Also Published As

Publication number Publication date
RU2016115384A3 (en) 2018-08-15
RU2016115384A (en) 2017-11-10
KR20180093100A (en) 2018-08-20
IN2013MU03161A (en) 2015-07-03
EP3053061A4 (en) 2017-04-19
PH12016500592A1 (en) 2016-06-13
SG11201602622QA (en) 2016-04-28
IL244824B (en) 2020-09-30
KR101995741B1 (en) 2019-07-03
EP3053061A1 (en) 2016-08-10
RU2708357C2 (en) 2019-12-05
IL244824A0 (en) 2016-05-31

Similar Documents

Publication Publication Date Title
US11190478B1 (en) Enhanced user interfaces and associated processes in email communication
KR102045585B1 (en) Adaptive input language switching
US9984072B2 (en) Method, apparatus, and system for providing translated content
US10394964B2 (en) Gesture based system for translation and transliteration of input text and a method thereof
US20150356304A1 (en) Device and method for activating security function for chatting region
KR20170060439A (en) Multilingual Interpretation and Translation Apparatus having Function of Automatic Language Setting and Method thereof
US9557818B2 (en) Contextually-specific automatic separators
CN104683963A (en) Information processing method and electronic equipment
US20150020012A1 (en) Electronic device and input method editor window adjustment method thereof
JP5791219B1 (en) Instant message transmission / reception program, information processing method, and information processing apparatus
US20210312899A1 (en) A system and method for multilingual conversion of text data to speech data
US20150309590A1 (en) Inputting method and associated electronic device
EP3053061A1 (en) A gesture based system for translation and transliteration of input text and a method thereof
US10346606B2 (en) Generation of a captcha on a handheld touch screen device
EP3105858A1 (en) Electronic device and method for extracting and using sematic entity in text message of electronic device
US20130344847A1 (en) Method and apparatus for processing memo while performing audio communication in mobile terminal
KR20110099963A (en) Method for processing message, terminal and system thereof
US10979582B2 (en) Extension of remote frame buffer (RFB) protocol
KR20160071400A (en) A gesture based system for translation and transliteration of input text and a method thereof
US10114904B2 (en) Method, system, electronic device and server for synchronous display of operating information
KR101618353B1 (en) Method for synchronizing server-side language and client-side language, and server-client system using the same
US20150161091A1 (en) Terminal device, method of controlling display of the terminal device, and communication system including the terminal device
KR101941463B1 (en) Method and apparatus for displaying a plurality of card object
CN106776059B (en) Text input method and system
Oktem et al. EDITING TEXT BASED ON INPUT MODE

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14851325

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 244824

Country of ref document: IL

WWE Wipo information: entry into national phase

Ref document number: 12016500592

Country of ref document: PH

NENP Non-entry into the national phase

Ref country code: DE

REEP Request for entry into the european phase

Ref document number: 2014851325

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: IDP00201602607

Country of ref document: ID

Ref document number: 2014851325

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 20167011036

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2016115384

Country of ref document: RU

Kind code of ref document: A